Creating trust in an AI world

One Year Later – The ChatGPT Effect

On this day a year ago…

One year ago today, ChatGPT was unleashed upon the world, triggering a capability arms race in AI development. Millions signed up: by September 2023 the service had passed 180 million users and 1.5 billion visits. SaaS providers everywhere raced to integrate the new technology into their offerings, and the funding floodgates opened for larger, more capable LLMs. A lot has happened in one year.

The past year has been a journey in learning to navigate these new advanced statistical capabilities we call AI, a term perhaps as multifaceted as human intelligence itself, which can describe anything from social awareness to mechanical comprehension. It has been a year of excitement for many, and of worry for many more.

Looking ahead to 2024, a pivotal year with significant elections globally, there’s an air of disquiet. Since their initial use in Obama’s 2008 campaign, AI technologies have increasingly influenced political landscapes. The prospect of two billion voters, two-thirds of the world’s democratic electorate, casting their votes raises the stakes immeasurably. The potential for misinformation, like an election-day deepfake, is just one concern among many, not to mention the subtle yet persistent effects of algorithmic nudging. Trust in our institutions is low, and bestselling author Yuval Noah Harari worries these will be the last truly human elections. 

Of course, new technology isn’t only affecting the political arena; it is becoming increasingly relevant in our workplaces. Employee engagement, or rather the lack of it, is a growing concern: a recent Gallup study found that a mere 15% of employees feel actively engaged in their roles. This disengagement manifests as quiet quitting, silent resignations and a pervasive doubt in leadership’s ability to act in employees’ best interests. Into this mix, we introduce these new, disruptive technologies.

Creating trust in an AI world

Yet, the impact of AI is not uniform across organisations. While many may resonate with these challenges, others tell a different story.

AI, in its essence, can be a powerful force multiplier. Just as these models reflect the biases of their training data, their influence reflects the organisational culture they are deployed into. In a negative environment, AI might accelerate job displacement and reinforce a lack of agency; in a positive one, it can foster innovation, create jobs and empower individuals.

A significant number of CEOs, 39% to be precise, believe their companies will not survive beyond a decade without significant change. For those ready to embrace it, AI offers a beacon of hope for adaptation and longevity. After all, for many employees, the risk of losing a job because their organisation goes out of business far outweighs the risk of losing it because their work can now be automated.

The key differentiator between these divergent outcomes is trust. This isn’t a new concept in leadership, but it is gaining a new dimension in the age of AI. Effective leaders today are those who embody honesty, reliability and accountability, coupled with a willingness to learn how these technologies work and, therefore, what opportunities and risks to look out for. This approach goes by many names, none of them perfect, but it is perhaps best summed up as data-driven leadership.

There is no 'Attenborough of AI'

The coming years will be pivotal for organisations, requiring careful thought about how best to deploy these innovations. Strategic partnerships, guardrails and operational pivots will do much to define an organisation’s future. Leaders must navigate these choices while capitalising on, rather than alienating, their greatest asset: their people.

Many crucial decisions stand between where we are and the eventual success of our data and AI strategies. There is no David Attenborough of AI, no single source of authority we can look to for all the answers. The solution instead lies in developing organisational leaders who can understand both the technological landscape and the deep cultural context that landscape exists within. 

2024: A year for the wise and the technologically savvy

At Corndel, we partner with organisations to develop these skills at every level of the organisation. Our Data-Driven Leadership programme, delivered in partnership with Imperial College London, is designed specifically to develop the critical data skills managers need to make insight-led decisions, communicate them effectively and build data-literate teams, empowering leaders and managers to navigate these important changes.

AI is not just a technological tool; it’s a mirror reflecting the culture and values of the organisations that deploy it. It is therefore more critical now than ever for organisations to focus on developing performance cultures that are built on trust and psychological safety.

Leadership in this new era is not about discarding traditional values but augmenting them with technological understanding and foresight. It’s about building a culture of trust where AI can be a tool for innovation and progress. For those willing to learn, adapt, and lead with a blend of wisdom and technological savvy, the future is bright. 

Written by Jake O’Gorman, Director of Data Strategy at Corndel, the strategic workplace training provider.
