The initial penetration was slow. The original Diners Club credit card, introduced in 1950, was too novel to catch on quickly. But it carried a compelling message: the historic burden of trusting the customer shifted away from the merchant, a service for which the merchant was willing to pay.
Merchants reluctant to adopt the new cards were bypassed by consumers, and in due time virtually no business could sustain itself without accepting credit cards. Over recent decades, many organizations have challenged the credit card solution, courting merchants with cheaper fees. Still, merchants were not persuaded. Nobody could compete with the gravitas of the ubiquitous cards, which, until very recently, looked invincible.
Only now has a new technology emerged, offering consumers a robust answer to a rising concern: privacy. It's a concern the credit card cannot address, even as the concern itself grows more serious by the day.
When we unleash artificial intelligence, or AI, on the full canvas of people's payment history, we get a frightfully accurate reading of every one of us. We are stripped naked, psychologically speaking.
It turns out that we all expose ourselves with respect to status, ambitions, pleasures, professions, hobbies, secrets, weaknesses, proclivities—everything about who we are. All this through the mundane list of what we choose to pay for. AI analytics compares the expenditure pattern of each of us to the patterns of all of us. Genetic algorithms build brain-like inference networks that draw clear conclusions from messy payment and expenditure data.
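As a rough illustration of the kind of pattern comparison described above, consider the sketch below. It reduces each consumer's history to per-category spending totals and scores similarity with a simple cosine measure. The categories, figures, and names are hypothetical, and real profiling systems are far more elaborate; this is only a minimal sketch of the principle.

```python
# Illustrative sketch (not the author's system): compare one person's
# spending pattern to everyone else's, the way profiling analytics might.
# All categories and dollar figures are hypothetical.
import math

CATEGORIES = ["groceries", "pharmacy", "travel", "gambling", "books"]

def normalize(spend):
    """Turn raw dollar totals into a unit-length pattern vector."""
    total = math.sqrt(sum(v * v for v in spend))
    return [v / total for v in spend] if total else spend

def similarity(a, b):
    """Cosine similarity: 1.0 means identical spending 'shape'."""
    return sum(x * y for x, y in zip(normalize(a), normalize(b)))

# Each row: one consumer's monthly totals, aligned with CATEGORIES.
population = {
    "consumer_A": [420.0, 35.0, 0.0, 0.0, 60.0],
    "consumer_B": [380.0, 40.0, 0.0, 0.0, 55.0],
    "consumer_C": [150.0, 300.0, 0.0, 900.0, 0.0],
}
me = [400.0, 30.0, 0.0, 0.0, 70.0]

# The "profile" is simply whom I most resemble; whatever labels were
# inferred about them (health, habits, risk) then transfer to me.
best = max(population, key=lambda k: similarity(me, population[k]))
print(f"Most similar spender: {best}")
```

The unsettling point the column makes is visible even in this toy: no single purchase gives you away, but the overall shape of your spending places you next to people about whom conclusions have already been drawn.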
Imagine Visa and Mastercard getting to read the file your psychologist or therapist keeps on you, and, further, sharing this file with governments, marketing companies, and hiring outfits. Suddenly, people will be turned down for loans and jobs that had seemed a sure bet. Racism and other forms of discrimination creep back in, just because your psychological profile is so revealing.
But what is really troubling is that, unlike today's credit scores, where errors can be contested and rectified, the AI's data-to-conclusion process is not understood by humans and cannot be appealed by its victims.
The credit card paradigm cannot provide privacy; its very premise is knowledge of the customer. Digital money, on the other hand, fits the bill. Banks operating a digital-money solution such as BitMint would mint privacy-protecting digital money and sell it as a product at a 0.5% markup. The anonymous buyer could then walk into a store and buy what he secretly likes, paying with BitMint. The merchant would instantly redeem the digital money for nominal dollars (at a 0.5% fee) and hand over the merchandise.
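A minimal sketch of that mint-pay-redeem flow, assuming only the fees stated above; the class and method names here are hypothetical and do not reflect BitMint's actual protocol:

```python
# Sketch of the mint -> pay -> redeem flow, with the stated 0.5% markup
# at minting and 0.5% fee at redemption. Names are hypothetical.
import secrets

MINT_MARKUP = 0.005
REDEEM_FEE = 0.005

class DigitalCoin:
    """A bearer instrument: whoever holds the token can redeem it."""
    def __init__(self, face_value: float):
        self.face_value = face_value
        self.token = secrets.token_hex(16)  # random token, no identity attached

class MintingBank:
    def mint(self, face_value: float) -> tuple[DigitalCoin, float]:
        """Sell a coin for face value plus markup; the buyer stays anonymous."""
        price = face_value * (1 + MINT_MARKUP)
        return DigitalCoin(face_value), price

    def redeem(self, coin: DigitalCoin) -> float:
        """Pay the redeeming merchant face value minus the redemption fee."""
        return coin.face_value * (1 - REDEEM_FEE)

bank = MintingBank()
coin, paid = bank.mint(100.00)   # buyer pays $100.50 for a $100 coin
# ...the buyer hands the coin to a merchant; no name changes hands...
received = bank.redeem(coin)     # merchant instantly receives $99.50
print(f"Buyer paid {paid:.2f}, merchant received {received:.2f}")
```

Note that the bank sees a coin being minted and a coin being redeemed, but nothing in the token links the two events to a named customer; that separation is what the credit card model, by design, cannot offer.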
Neither the merchant nor the bank would have any clue as to who bought this merchandise. The same is true for books that might expose embarrassing interests, or medical devices that imply private illnesses. Identity-masking services will sprout, allowing people to shop online and receive merchandise anonymously.
Merchants resist new payment solutions, but, on top of the much lower acceptance cost, the public will demand this one. Travelers to faraway places are reluctant to surrender their credit card information to a small vendor in a foreign land. With digital money, there are no such worries.
The battle over privacy and dignity in cyberspace is the battle royal in the payments world for this century. Our very sense of well-being is tied to our ability to draw a veil around our most intimate transactions. If our expenditures remain exposed, little is left hidden, little protected from judgment, exploitation, and unfixable errors.
—Gideon Samid gideon@bitmint.com