In the Age of Trust, ‘trust score’ is the KPI you need

We seem to be in the Age of Everything: the Age of the Customer, the Age of Artificial Intelligence, the Age of Data, the Age of Digital Transformation, the Age of Disruption, and so on. If there is one thing these ages have in common, it is their focus on delivering memorable customer experiences. But without integrity, those experiences are destined to evaporate. Welcome to the Age of Trust.

In the Age of Everything, data is a vital component. The available amount is abundant and growing exponentially; in fact, about 90% of all the world's data was created in just the past two years. That data provides valuable insights, helping brands, for instance, to develop and target personalized services to your liking. At the same time, however, it causes friction. We all know that tech giants like Google and Facebook seemingly offer their services for free, while in fact we pay for them with our privacy. It is no secret that these companies probably know more about you than your own mother or spouse does.

Privacy concerns

If you like, you can see for yourself. Building on the groundbreaking work of Carl Gustav Jung, CrystalKnows.com lets you check anyone's personality. The company uses various data sources and claims 80% accuracy in doing so. Of course, it too collects your data in return. You might also want to think twice before sharing your genetic details for a DNA match on websites like MyHeritage, Ancestry, or 23andMe.

Scandals and data breaches have resulted in growing public awareness and the need for stricter regulations like GDPR and PSD2. You'll probably recall the political profiling by Cambridge Analytica during the 2016 US presidential campaign. People are starting to see the downside of their digital footprint, and increasingly value their privacy over algorithmic personalization. I wouldn't be surprised if some tech giants end up facing mega claims, as seen earlier in the tobacco industry, unless they manage to drastically change their business models. We are now willing to pay for ad-free experiences; someday soon, we might pay for our digital anonymity as well. Privacy by design is becoming the new standard. This week, Mark Zuckerberg finally announced Facebook's "first step in building out a privacy-focused social platform".

Living in a bubble

There's another downside to data-driven personalization: becoming biased. Profile-based newsfeeds and tailored advertising help us conveniently find our way in an increasingly complex world. But they may also keep us from being exposed to different perspectives and alternative options. Without realizing it, we find it harder to listen without prejudice, remain open-minded, and distinguish fake news from real. Finding the truth is getting even harder with the rise of so-called click farms, where you can buy thousands of fake reviews and likes for just a few dollars. RAND studies Truth Decay, which it defines as "the diminishing role of facts and data" (in American public life). There is no single truth – we each have our own.

In his 2017 farewell speech, Barack Obama spoke about people retreating into their own bubbles as a threat to democracy:

“For too many of us, it’s become safer to retreat into our own bubbles, whether in our neighborhoods or on college campuses, or places of worship, or especially our social media feeds, surrounded by people who look like us and share the same political outlook and never challenge our assumptions. The rise of naked partisanship, and increasing economic and regional stratification, the splintering of our media into a channel for every taste — all this makes this great sorting seem natural, even inevitable. And increasingly, we become so secure in our bubbles that we start accepting only information, whether it’s true or not, that fits our opinions, instead of basing our opinions on the evidence that is out there.”

Ethical behavior

The massively increasing use of artificial intelligence (AI) raises ethical questions too. Should a self-driving vehicle, for instance, decide to kill a child or an elderly person when a collision with either is inevitable? Through AI and deep learning, we are not far from reaching the technological singularity – the point where computers become smarter than human beings. That power can be applied for good and for bad: think of eradicating poverty and disease, but just as well of autonomous weapons that are beyond human control. In 2015, over 8,000 scientists, including Stephen Hawking, Elon Musk, Steve Wozniak and Eric Horvitz, signed the open letter on artificial intelligence. It called for research on ensuring that "AI systems must do what we want them to do". As with any powerful tool or weapon, we need to trust whoever or whatever controls it. The 2018 documentary Do You Trust This Computer? dives right into this matter. In the film, a scientist states: "It stands to reason that whoever has the best AI will probably achieve dominance of this planet."

What happens to one's sense of autonomy when the horror scenario of a Black Mirror episode may become reality in China? When world leaders withdraw from nuclear and climate treaties, our faith in humanity is at stake. When we entrust our children to a babysitter, we don't want to worry all night. And when you are being anesthetized for major surgery, you want to feel comfortable putting your life in the doctors' hands. Or, for many, in God's hands.

The 2019 Meaningful Brands report by Havas states that, globally, 84% of people think companies and brands should communicate honestly about their commitments and promises, while only 38% think they actually do so. At the same time, 77% prefer to buy from companies that share their values, and sustainability is becoming a business model.

iBeleggen, a Dutch asset management company, clearly understands what matters most to its customers: its tagline reads "the best thing we can earn is trust". Blockchain technology was designed to eliminate unethical human intervention by enforcing data integrity across decentralized nodes. And Tim Berners-Lee, the man who invented the world wide web 30 years ago, is now aiming to reinvent it by safeguarding our privacy with the Solid project. The Public Spaces initiative is striving for something similar.

Trust as KPI: introducing ‘trust score’

Who (or what) do we rely on for our privacy, information, technology, assets and health? I believe we're entering an age in which trust is valued more than ever. This idea isn't new and has been described before: Edelman publishes its Trust Barometer yearly, and a company like TrustPilot calls upon the wisdom of the crowd by enabling trust ratings.

Brands can (or even should) measure trust by adding a ‘trust score’ to their KPI framework. I have seen several definitions and calculation methods, but this long read by the Institute for PR will give you a good head start.
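
To make this concrete, here is a minimal sketch of one possible calculation: averaging customers' agreement with a handful of trust statements on a 1–5 scale and rescaling the result to 0–100. The statements, weights and scale below are illustrative assumptions of mine, not the Institute for PR's method or any official survey instrument.

```python
from statistics import mean

# Illustrative trust dimensions (assumed for this sketch).
# Each respondent rates agreement on a 1-5 Likert scale.
TRUST_STATEMENTS = ["competence", "honesty", "reliability", "transparency"]

def trust_score(responses: list[dict[str, int]]) -> float:
    """Average agreement across statements and respondents, rescaled to 0-100."""
    per_respondent = [mean(r[s] for s in TRUST_STATEMENTS) for r in responses]
    avg = mean(per_respondent)            # value between 1 and 5
    return round((avg - 1) / 4 * 100, 1)  # rescale to 0-100

# Example with three fictional survey responses
responses = [
    {"competence": 4, "honesty": 5, "reliability": 4, "transparency": 3},
    {"competence": 3, "honesty": 4, "reliability": 4, "transparency": 4},
    {"competence": 5, "honesty": 5, "reliability": 5, "transparency": 4},
]
print(trust_score(responses))  # prints 79.2
```

However you define the score, the point is to track it over time alongside your other KPIs, so a drop in trust becomes visible before it shows up in churn.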

In the Age of Everything, it is our job to win our customers' hearts and loyalty by earning their trust. In addition to valuable KPIs like FTR, TCR, CES and NPS, a trust score can monitor just that. So I'd like to say: welcome to the Age of Trust.

 

Updates

March 21, 2019 – The relevance of this topic for businesses was confirmed last week, in the Future of the Firm report by O’Reilly. It mentions “Trust, responsibility, credibility, honesty, and transparency” as the first big-picture item affecting the future of the firm. “Customers and employees now look for, and hold accountable, firms whose values reflect their own personal beliefs. We’re also seeing a “trust shakeout,” where brands that were formerly trusted lose trust, and new companies build their positions based on ethical behavior. And companies are facing entirely new “trust risks” in social media, hacking, and the design of artificial intelligence (AI) and machine learning (ML) algorithms.”

March 25, 2020 – On March 20, 2020, the Financial Times published an article by Yuval Noah Harari about how governments are dealing with the Covid-19 crisis. In it, Harari made a strong statement on trust: “In this time of crisis, we face two particularly important choices. The first is between totalitarian surveillance and citizen empowerment. The second is between nationalist isolation and global solidarity. (…) People need to trust science, to trust public authorities, and to trust the media. Over the past few years, irresponsible politicians have deliberately undermined trust in science, in public authorities and in the media. Now these same irresponsible politicians might be tempted to take the high road to authoritarianism, arguing that you just cannot trust the public to do the right thing.”

September 30, 2020 – The new Netflix documentary The Social Dilemma taps into the fact that we, the users, have become the product of the tech giants, and that advertisers are their clients. Interestingly, Netflix itself can be regarded as a tech company that applies user data and algorithms to keep its viewers hooked.

 

A copy of this article was posted on Medium and LinkedIn.
