Rachel Botsman, DLD Munich 2019

Rachel Botsman: Why Innovation Depends On “Leaps of Trust”

Trust enables new ideas to travel. Without it, we’d never get behind the wheel of a self-driving car or let a smart assistant into our homes. In her latest book Who Can You Trust?, author and Oxford University lecturer Rachel Botsman describes how technology is changing the nature of trust and how we can adapt to a world of trust on speed. At DLD Munich 2019, Rachel will be sharing her latest research on the relationship between trust and innovation and on what happens when trust breaks down.

Many people have lost trust in tech leaders and companies. What do companies need to do to rebuild this trust?

Well, I’d start by changing the language that we’re using. When we talk about “rebuilding trust”, it sounds as if trust were something physical that can be measured, bought and controlled in conventional ways. But we need to view trust as a human feeling that can only be earned over time, through being more trustworthy. This is not achieved through grand gestures, Senate hearings or ad campaigns. We earn trust through small, repeated actions and behaviors. The good news is that there is a science behind what makes a person or organization trustworthy, so we have sort of a roadmap to follow: a four-part formula made up of competence and reliability (the “how” traits) and integrity and empathy (the “why” traits). To move forward, tech leaders should be focusing far more on character traits, particularly integrity. It really comes down to continually demonstrating that the company’s intentions and incentives are aligned with the wellbeing of society, individuals, and particularly their users. Easy to say but incredibly hard to do at scale.

Why are technology and innovation so dependent on trust?

Just as money is the currency of transactions, trust is the currency of interactions, enabling new ideas to travel. When we take a risk to do something new, or in a fundamentally different way that would once have been unthinkable, we take what I call a “trust leap”. We don’t actually know for sure what the outcome will be. Take the first time we put our credit card details into a website, or our first online dating experience: these were trust leaps. Trust leaps create new possibilities; they carry us over the chasm of fear, that gap between us and the unknown. That’s why, when trust breaks down, it’s such a problem for innovation. Not only does it wreak havoc internally in terms of employees’ confidence in taking risks, but it also becomes harder for companies to launch new products, to venture into the unknown. We don’t feel safe with them. That’s the heart of distrust.

New technologies like AI hold great promise, but how do we benefit from digital health, voice assistants or face recognition while also protecting our privacy?

AI has major trust issues – some perceived, others warranted. Take smart home assistants. Study after study shows that many people believe the technology is always listening – even when it hasn’t been given a command. A recent poll by Accenture in the UK showed that 55 percent of users are frightened of being hacked or having their personal details stolen. The real issue is not whether we will put these devices into our homes, but whether we will use them to their full potential, because we don’t trust them enough. If we don’t fix AI’s trust issues, particularly in relation to privacy, we’ll still be using AI to check the weather forecast, perhaps, or to ask for random jokes about pirates, when these devices are capable of hundreds of other, more useful functions. Bridging the trust gap really comes down to security and integrity. We need to believe our information won’t be hacked and used inappropriately, but above all else, we need to feel confident that the technology (and its providers) has good intentions – our best interests at heart.

What topic deserves more attention in 2019?

Top of my list would be much deeper research on children, mental health and tech addiction. We’re hearing such mixed messages. And the research needs to be followed by clear actions from schools and practical wisdom for parents that goes beyond “turn the devices off”. I’d put gender and trust as the second topic. Very little research has been done exploring whether there are gender differences in the way we trust in digital environments, and the implications could be massive. Third would be the speed of tech cultures. There is a tendency for tech cultures to be driven by efficiency, speed and growth. (The Facebook mantra of “moving fast and breaking things” is a perfect example.) We need to radically rethink how we can build tech cultures that create the space and permission to make ethical choices that really are in the best interests of users, even when that seems hard or like it might slow us down.


The Currency of Trust

In her DLD talk Rachel Botsman describes the importance of trust in business and daily life.
