Privacy is one of the biggest issues facing payments. Individuals, businesses, and governments confront decisions on how data is gathered, stored and protected, and used.
We all generate huge amounts of data every day as we shop online or in stores, browse the Web, or even just move around with smartphones in our pockets. Payments companies, retailers, advertisers, and others want to collect that data, analyze it, and use it.
We risk ending up in a payments panopticon. A panopticon is a form of prison where all the cells are arranged in a circle with guards at the center who can see into every cell. As payments, social media, and data analytics converge, every aspect of our lives could soon be viewed through the payment tools we use.
The scale of data collection and sharing is so large and complex that we can no longer protect ourselves against data compromises or manipulation. For example, we can’t know if malware has been installed on a card reader or if companies are safely storing and handling our data.
One example is the trove of data uncovered in October by security researchers at Data Viper on an openly accessible server. They found information on 1.2 billion individuals, including names, email addresses, phone numbers, and LinkedIn and Facebook profile information. The data appeared to have been gathered from public sources and combined with information bought from a data aggregator.
There is no way we can forbid a data broker from selling information it has gathered or prevent a screen scraper from pulling data off Web sites. We also can’t force companies not to make security mistakes like storing our data in the clear.
As well-meaning companies develop tools to bring things like social-media associations into lending decisions, we are rapidly approaching a network that will rival China’s social-credit system. It’s an ugly view of the future to think that we could end up in a social panopticon of our own making where our every move is scrutinized.
Facing the threat of hacks on one side and privacy violations on the other, our only options are to minimize our interactions with companies we don’t trust and shrink our digital footprints. But it’s impossible to recall data that has been released into the datasphere.
Legislators and regulators are looking to provide some defenses. There are many privacy bills in the U.S. Congress, most focused on specific topics, such as genetic information, Internet tracking, and tracking passengers on planes. A number of states are considering privacy bills as well. The only comprehensive law enacted so far is the California Consumer Privacy Act (CCPA), with enforcement beginning in July. But given how many bills are pending in other states, California won’t be alone for long.
Regardless, payments companies have a vested interest in strong privacy protections. They can protect themselves from reputational, regulatory, and financial risk by making sure that data is secure and shared judiciously.
An open-source effort called The Digital Standard is working on best practices for data security and privacy. It recommends that products offer the highest level of data privacy by default and require customers’ affirmative consent for all data sharing. The Standard’s Web site publishes recommendations for both data security and data privacy, along with procedures for testing each.
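The “private by default, opt in only” principle can be illustrated with a short sketch. This is a hypothetical illustration, not anything The Digital Standard prescribes; the class and setting names are invented for the example:

```python
from dataclasses import dataclass, field

# Hypothetical sketch of "privacy by default": every data-sharing flag
# starts off, and can be enabled only by an explicit, recorded opt-in.
@dataclass
class ConsentSettings:
    share_with_advertisers: bool = False   # off unless the customer opts in
    share_with_analytics: bool = False     # off unless the customer opts in
    opt_in_log: list = field(default_factory=list)  # audit trail of consent

    def opt_in(self, setting: str) -> None:
        """Enable a sharing flag only through an affirmative user action."""
        if setting == "opt_in_log" or not hasattr(self, setting):
            raise ValueError(f"unknown setting: {setting}")
        setattr(self, setting, True)
        self.opt_in_log.append(setting)  # record that consent was given

settings = ConsentSettings()
print(settings.share_with_advertisers)  # False: nothing is shared by default
settings.opt_in("share_with_advertisers")
print(settings.share_with_advertisers)  # True only after explicit consent
```

The design choice the sketch makes visible is that there is no code path that turns sharing on silently: the defaults are all off, and the only way to change them leaves an audit record.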
The cornucopia of data provided by people’s digital footprints offers opportunities for analyzing, sharing, and selling that data. But every opportunity has a cost. The cost in this case is that companies must pay attention to privacy or risk having their products and services abandoned by people who don’t want their lives exposed for commercial gain.