The power and pitfalls of AI

30th September 2024 | Sue Turner OBE

Good governance and a thought-through approach are essential if we are to use AI responsibly

I was seven years old when I made my first sale. The family business provided musical education concerts for children, so in the school holidays I was press-ganged into helping. I quickly found that being front-of-house selling merchandise was much more fun than being backstage, cementing my lifelong love of making a sale.

Back then, my sales engagement patter was simple: “Do you know the difference between the records?” It was an effective closed question, but now seems desperately old-fashioned in a world where we have so many different tools, technologies and routes to market.

The technology grabbing the headlines today is of course artificial intelligence (AI). AI can analyse vast datasets to automate lead generation, predict customer preferences and hyper-personalise marketing messages. While this technological leap forward is undeniably impressive, it also raises profound questions about the future of human interaction in sales and the ethical implications of wielding such power. Research by AI Governance, however, found that 91% of organisations had no controls on their AI use, so it’s not surprising that businesses are hitting the headlines for making avoidable mistakes with AI.

How to use powerful AI tools well

All AI systems rely on large amounts of data to function effectively. When we are dealing with customer data, we need to be particularly careful about how it is collected, stored and used, as AI can open up new risks for this sensitive information. For example, we could collect a person’s browsing history, purchase history and social media activity to create a detailed customer profile. This information could then be used to target the existing or potential customer with personalised advertising or sales pitches that exploit their vulnerabilities or passions.

The technology exists to let us do this, but is it desirable? At what point does this move from being good business sense to becoming manipulative? And in your organisation, who decides where that line sits? I recommend all organisations use a “Responsible AI” (RAI) framework to guide them through the minefield of options. One of the key tenets of the AI Governance RAI framework is transparency (see Figure 1).

Figure 1: AI Governance RAI framework.

In the example described above, we should consider telling people that we are profiling them in this way. If your team is not willing to be that transparent, fearing that people would find it intrusive and be put off dealing with you because of how their data has been collected and used, that is a strong indicator that you should not be doing it.

Bias in procurement

It can be difficult to understand how AI systems reach their conclusions, particularly with complex algorithms, but we should endeavour to explain how the tools we use work. If we can’t, avoiding bias becomes difficult.

AI systems are trained on past data, which may contain biases that can lead to discriminatory outcomes. For example, an AI-powered procurement system trained on historical data might prioritise bids from suppliers located in certain countries or with a history of working with the company, even if other suppliers offer a better deal. This can perpetuate existing inequalities and limit competition. To mitigate algorithmic bias, companies should ask more questions about how diverse and representative the data used to train their AI systems is.

The mantra to play on repeat when thinking about how to use AI in your organisation is: “Just because we can, doesn’t mean we should.” We could use AI-powered chatbots and virtual assistants to increase efficiency, handling large volumes of inquiries simultaneously. I have seen these systems used well in financial services to give customers instant responses and analyse the conversation to find trends.

But AI systems can lack empathy and nuanced understanding, potentially leading to impersonal interactions, so should we use them? We don’t want our customers to become frustrated if the AI cannot resolve their query, but is this disadvantage outweighed by the positives for other customers and the advantages gained by the business?

Many companies are experimenting with using AI to record, transcribe and analyse virtual meetings; however, in my experience, they rarely give enough thought to risks, such as how commercially sensitive information is being stored and who can access it. Before operationalising any AI system that interacts with people (whether internally or externally), we need to investigate the potential impacts on people, assess all potential risks and be prepared to pull the plug on what might look like a great idea if we can’t control the risks sufficiently.

Will AI affect my career and job prospects?

All of us have questions about how AI may affect our jobs. In sales, as AI automates many tasks currently performed by human salespeople, there is a risk of job losses; however, I don’t foresee AI replacing human salespeople entirely. Instead, the sales role will evolve, with a greater emphasis on building relationships and using technology well to spot trends and opportunities.

The pace of change in AI is extraordinary, so employers must invest in training and development to equip their sales teams with the skills needed to thrive in an AI-driven world. As individuals, we also have a responsibility to keep up to date with what AI can do.

Make sure you understand, for example, that ChatGPT is not surfacing truth from somewhere. When you ask a generative AI tool like ChatGPT, or another large language model (LLM), to answer your question, it is looking at patterns in vast amounts of data and assembling what is statistically likely to be a good response to your query. The answer it gives can be untrue – a so-called hallucination – and even when you ask the LLM to provide sources for its output, you need to check that the sources actually say what the LLM claims they do. Investing in your understanding and skills is vital to keep your career on track.

Where there are going to be job cuts, I encourage companies to reskill or upskill employees whose roles are affected. This can lead to companies retaining people in different roles, keeping the knowledge they have built up over many years.

Changing dynamics in procurement

Selling into structured procurement processes is already being reshaped by AI, changing the dynamics of buyer-seller relationships. Walmart has been using AI to draft and negotiate supply contracts, with algorithms able to surface detailed items in contracts that might otherwise be overlooked. The company did not seek to shift all the power onto its side. Instead, it aimed to create win-win situations for suppliers: for example, shifting payment terms in a supplier’s favour in return for other gains for Walmart.

Sales professionals are increasingly likely to be negotiating against algorithms, requiring a shift in approach. Firstly, understanding your customer’s data culture is essential. What data is important to them? What key performance indicators will be used to assess your product or service’s value? Are there system compatibilities or new automated reporting requirements to take into account? Can you showcase your company’s data-driven capabilities to add new insights of value to the customer?

When you are procuring AI tools, remember that you have power and responsibility in this relationship too. For example, the environmental impact of AI is often overlooked but it matters. Training and running AI systems requires significant computing power, which can contribute to climate change through energy and water consumption. Companies implementing AI in sales should ask questions about energy efficiency and push developers and third-party suppliers to reduce environmental impacts.

Conclusion

AI has the potential to revolutionise the sales industry, but it is essential to approach its development and implementation with caution. Over time, we will see more industry-wide guidelines and standards providing a framework for responsible AI use, and audits will follow to ensure organisations are up to scratch. In the meantime, if you want to build trust with customers and stakeholders, make sure your organisation has a structured way to think through the ethical minefield so you can harness the benefits of AI while minimising risks.

In the end, the art of selling remains fundamentally unchanged. Whether it’s a child selling LPs or a seasoned professional leveraging AI, the ability to connect with customers, understand their needs and build trust is still the key to success.