What happens if next time you ask Alexa to order detergent, “she” not only picks the vendor that most profits Amazon, but also prices the product based on what Amazon knows about your financial-pain points and then accesses your Amazon financial account via an Amazon payment service in ways that leave you the only participant in this transaction who gets all wet if anything goes wrong?  This is among the questions we answer in a new paper finding that the transformation of consumer finance through new tech services has tremendous potential, but also profound peril, especially to consumers without the financial resilience, educational background, or just patience to understand the huge differences between regulated providers and all the ambitious newcomers gunning for them.

It’s not, of course, that regulated companies put economic equality before profit – they don’t.  It’s also not that regulators always meaningfully discipline errant providers when a bank or similar entity harms vulnerable households.  However, all of the rules and all of the regulators do ensure that numerous protections are almost always built into consumer-finance products, that resources are on hand to handle complaints and mistakes, and that capital and liquidity can be claimed if customer funds go missing, identities are breached, servicing is faulty, or large fines are imposed.  The rules may not be tough enough, the company isn’t always ready to help, or the fines may be too small, but at least a structure exists for consumer protection and redress.  Tread beyond the regulators’ reach, and it’s a lot more dangerous out there.

Further, many statutory safeguards – FDIC insurance, $50 loss limits, and so forth – have applied so long to regulated products that most consumers think it’s the product that’s regulated, not the provider.  When an unregulated provider offers a like-kind card, payment-service, or even deposit product, most consumers will think the product is risk-free even though they are in fact putting their financial well-being on the line.

All of these differences constitute what is called regulatory asymmetry, which leads to regulatory arbitrage, which leads to trouble.  The counter-argument – of which I heard plenty following the paper’s release – is that service agreements and other self-regulation suffice or even beat mandatory regulation.  But, as we note in the paper, it would take you at least four months of eight-hour days to read all the service agreements for the tech products most households now use.  Even if you read them all, would you then know if a company changed the agreement or how using one tech service with an acceptable agreement links to another with one that should give you the willies?

We noted in our paper that tech firms integrate services without giving users any rights over how data are used.  Germany has just barred Facebook from linking data collected on third-party apps and websites – or even from its own services – to a consumer’s social-media account without the consumer’s prior consent.  What’s more, Facebook can’t block a personal account even if the consumer denies it access to his or her WhatsApp or other service.  No such protections apply in the U.S., so none of us knows what happens to our data.  What will Facebook do with them when its new remittance service is up and running?  If it gets the access to bank data for which it has asked?  When it comes up with something new along the lines of a new bank in South Africa that uses data to price loans so only healthy people get credit?

There are clear privacy implications to shared data – as the Financial Times noted yesterday and our paper details, “User data is the most valuable commodity for most firms, but for consumer-facing platform technology companies, it is the only thing that matters.”  But, as several of you asked after our paper came out, where does economic inequality come into this?  After all, isn’t a breach of my personal data as important to me as it is to a less-affluent consumer?

In short, no.  The more financial products are inextricably or even invisibly linked to commercial transactions, the better firms will be able to target customers in ways that may well enhance profitability, but also discriminate on the basis of financial status, race, ethnicity, neighborhood, gender, disability, political perspective, or even how much you work out.  With all this data power, firms will also be better able to offer commercial products via the notorious “buy-now, pay-later” pitch.  These deceptive marketing tactics have long jacked up the cost of cars, windows, and roofs to vulnerable households.  What happens when they are also used for equality-essential financial services such as access to the payment system, mortgages, or deposits?

Finance has never been fair and banks aren’t exactly stewards of the downtrodden.  But the power of big data in the hands of fast-changing machine-learning systems constructed to maximize profit outside the reach of virtually any governmental edict poses access, cost, and conflict questions never seen before in U.S. consumer finance.  As our paper says, it’s essential to anticipate these equality challenges and curtail them now.  We’ve learned the hard, hard way how impossible it is to remedy the damage to household wealth or even to national financial stability if critical financial products come at undue cost to the most vulnerable among us.  If my detergent costs a bit too much or comes with a seemingly irresistible offer for a washing machine, I may be misled, but I’ll manage.  But, when forty percent of Americans are only one paycheck away from poverty, even a little off the top makes all too much of a difference to far too many households for tech finance to explode without built-in safeguards from the start.