
Lots of things in life require us to trust others: hiring a babysitter, taking medicine and buying a second-hand car online, to name a few. As in so many other areas, technology is transforming trust-based transactions, but it has also created new problems of its own.

When we were all muddy-faced peasants living in villages, we relied on gossip to know whom to trust. When life became more sophisticated, we created expert institutions to build societal trust. Now, thanks to technology, we outsource much of our judgment to total strangers, who rate products, drivers, apps, employers and people’s spare rooms online. This third trust revolution has underpinned the emergence of huge new digital businesses, such as Alibaba, Uber and Airbnb.

Connecting everybody online creates a powerful “hive mind” that can capture the wisdom of crowds. But one of the troubles with these models of distributed trust is that they are themselves too trusting — they do not pay enough attention to original sin. They also open up digital platforms to criminals and propagandists. Trust may be the currency of the sharing economy; sadly, it has already been debased.

Two Financial Times investigations this month have highlighted how trust can be manipulated on even the best-resourced tech platforms. The first showed how Amazon’s online marketplace in the UK had been infested with fake reviewers giving five-star recommendations to little-known Chinese products. Following the investigation, Amazon deleted 20,000 product reviews.

The second revealed how highly organised “ratings farms” have manipulated rankings on Apple’s App Store. Boosting the rating of an app above four stars can have a big impact on usage on a platform that facilitates more than $500bn of commerce. “The problem is that the truth gets rained on,” one industry expert said. 

Similar trust problems have plagued other platforms: Uber has had rogue drivers; Facebook, political propagandists; and Airbnb, fake customers exploiting loopholes in automated systems.

Rachel Botsman, the author and academic who was one of the earliest enthusiasts about the power of distributed trust, acknowledges that the revolution has so far failed to deliver on its potential. “The promise is still there, but the mechanisms are wrong and clunky,” she said.

It is undoubtedly hard to deliver trust on an industrial scale involving many millions of online interactions. But Ms Botsman highlights the positive examples of distributed trust that have emerged among small groups of local users on WhatsApp or Nextdoor helping out neighbours during the coronavirus crisis. “Where distributed trust works best is not in a global marketplace but where it is hyper-localised,” she said.

That suggests that our village mentality remains strong. Personal connections, word-of-mouth recommendations and societal sanction for the untrustworthy can all play a vital role in building online trust, too. The answer may be to try to integrate all three of the models, combining individual recommendations with institutional trustworthiness and collective wisdom. This combination can create powerful knots of trust.

One business that is attempting to do just that is UrbanSitter, a San Francisco-based start-up focusing on the babysitting market. Sitters register with the service and are checked out by company staff, who scour personal submissions, criminal records and social media posts. In return, families are scrutinised to ensure the safety of sitters. Both parties are subject to two-way appraisals and performance measures.

The company, founded in 2011, has facilitated more than 1m bookings across the US, creating a nationwide web of referrals. “All this builds up a much more trusted ecosystem,” says Lynn Perkins, UrbanSitter’s founder and chief executive. “We are trying to give you word-of-mouth trust in an easy-to-use service.”

It may be a similar story with blockchain — or distributed ledger technology — perhaps the ultimate distributed trust model. There are some workable, if limited, examples of trusted institutions creating “permissioned” blockchains between themselves to expedite share transfers, for example. But wholly “permissionless” decentralised blockchains, championed by the technology’s purists, can generate considerable risks.

Trust cannot, and should not, be purely automated. Humans have to remain in the loop.

john.thornhill@ft.com


Copyright The Financial Times Limited 2020. All rights reserved.