Are you being misled by AI claims? Helen Marshall delves into the concept of AI-washing and its implications
With the rise of AI-powered solutions it’s more important than ever to adopt a critical thinking approach. In mid-May 2024, OpenAI released a new and updated version of ChatGPT, powered by the GPT-4o model. It offered conversational audio responses, better image recognition, and more human-like, coach-like qualities. Less than 24 hours after its release, a number of well-known learning technology products began advertising that they now used this newer model.
And while this sounds exciting, and shows a willingness to move quickly on new technological advances, the broader market sentiment was: how much thought has really gone into this?
And once you start down that route, it’s easy to slide into mistrust of how those product developers are handling your data. Yes, AI advances are happening quickly, but that doesn’t mean products should be responding just as fast.
The buzz vs reality
In many of the recent conferences, such as Learning Technologies or ATD, the vendor exhibition space has been filled with excitement around AI. And rightly so. It is disrupting our user experience and changing the way we approach learning at work.
Imagine the excitement of a corporate L&D team at the showcasing of a new learning technology product, boldly advertised as powered by ‘advanced AI’. With promises of personalised learning paths and unparalleled engagement metrics, it’s easy to see why it might be enticing.
However, on deeper investigation (and hopefully before implementing the tool…) the AI’s capabilities turn out to be little more than a glorified random number generator, incapable of truly adapting to learner needs or providing any tangible analytics. This scenario is not just disappointing; it illustrates a rising trend in learning technology: AI-washing.
What is AI-washing and why should we be cautious?
AI-washing, analogous to the more widely known ‘greenwashing’, refers to the practice where companies misleadingly tag their products with the label of artificial intelligence to capitalise on AI’s current market appeal. This often involves overstating the role or sophistication of AI in a product when, in reality, the technology does not utilise true AI algorithms or does so in a superficial way that adds little real value to the product.
Ultimately, ‘AI’ has become a marketing label that attracts potential customers to a product. What is evident in L&D at the moment is a real confidence and capability gap: teams struggle to think critically about AI because they largely don’t have a strong base-level understanding of what they’re looking for – or worse – why they’re even looking for it in the first place.
The number of request for proposal (RFP) documents that ask: “Does your platform utilise AI?” with no explanation for why this is even a question, is baffling.
If there’s a general lack of sophisticated knowledge about AI in L&D – knowledge that extends beyond large language models – then there’s potential for technology products to play on that anxiety in their marketing. Not only do people feel vulnerable because things seem to be moving fast, but that vulnerability is amplified by a general lack of skill in AI, too. It’s a vicious circle.
Start with the problem
Any learning technology product worth its salt starts with the problem first. What are the vendors seeing across the industry, relevant to their market segment, that they can help to solve with AI?
Problems could range from ‘learning isn’t relevant enough to specific roles in my business’ to ‘how can I predict what I need to do next?’ or ‘how do I upskill faster based on my goals?’ These are quite broad statements, but once you start from a place of real-world issues, it’s much easier to understand how technology may (or even may not) be the solution.
Suppliers who start with a problem-first mindset, and aren’t attracted by the new shiny updates, are the ones worth paying attention to. Product development takes time, and while it can be expedited, it shouldn’t be rushed – particularly when you’re dealing with data.
Think critically
In the face of new technological advances, one recurring theme in conversation is leaning into our more human skills – the things that AI can’t take from us. One such skill is critical thinking.
For L&D professionals, the allure of AI-driven products can be strong, propelled by the promise of cutting-edge technology solving complex business challenges. However, it is crucial to maintain a healthy scepticism alongside a curious optimism for what these products can do.
Professionals should look beyond the marketing buzzwords and demand clear, understandable explanations of how AI is used and the specific problems it solves.
To foster this level of critical thinking it might be worth considering some of these questions:
* What specific AI technology does the product use?
Understanding whether the product uses machine learning, deep learning, or just simple algorithms that mimic intelligent behaviour can shed light on its actual capabilities.
* How does AI enhance learning outcomes?
Any claim of AI should be accompanied by clear evidence or case studies showing how it improves learning, retention, or engagement.
* Is the AI ethical and transparent?
It’s essential to consider whether the AI operates in a black box or if its decisions and processes are transparent and explainable, which is crucial for ethical educational practices.
* What data is the AI drawing from?
This is probably the most important question to ask, because it will help you determine whether your data is being handled correctly, and whether you can safely feed the product your own business data.
* How does the AI solve my problem?
If the problem you are trying to solve has been well established, then asking this question will be easy. If it’s not, you might need to go back to the drawing board.
Focus on problem solving
One fundamental shift that needs to occur in L&D procurement strategies is the focus on identifying the core learning challenges and needs before seeking technological solutions. By understanding the specific problems that need to be addressed, L&D teams can better assess whether AI is a suitable solution and avoid being swayed by AI-washed products.
At the end of the day, if you work in the learning or people development spaces, you should be a problem solver.
Problem solving is crucial because it directly impacts your ability to navigate challenges, innovate, and remain competitive. Effective problem solving leads to better decision making and resource management, reducing downtime and waste, while also maximising efficiency and outcomes.
If you can hone the skill to anticipate potential issues and respond proactively, rather than reactively, you’ll begin to foster a more adaptable and resilient organisational culture. Ultimately if you have adopted this mindset, then you’re setting yourself up to think more critically about the sort of methods and tools you’re bringing into the business to aid your own problem solving capabilities.
These are all things that matter to L&D teams. And, ultimately, if you have your mind set on impact, you’ll be thinking about how whatever you’re rolling out in your business affects the bottom line.
Have you fallen victim to AI-washing?
Don’t worry, you’re not alone. It’s never too late to start asking deeper questions. And don’t be afraid if you don’t know the answers to some of the questions you might get asked in return. It’s a learning curve for everyone; part of adapting to new technologies is having a growth mindset, and accepting that you might need to change the way you approach things.
As the learning technology market continues to grow, the temptation for companies to engage in AI-washing is likely to increase. For those in L&D, it is imperative to develop a keen eye for these practices.
By starting with clear, defined problems and maintaining rigorous standards for technological solutions, you can ensure that technology genuinely enhances learning, rather than falling for the allure of unfounded claims.
A last piece of advice
Look for a supplier who can support you to dig deeper, is able to answer your questions – or, better yet, is transparent when they can’t – and you’ll develop true partnerships based on trust and mutual respect, not marketing waffle.
Helen Marshall is Chief Learning Officer at Thrive