Three checkpoints when using AI in designing learning

Can a machine-human partnership elevate training efforts? Nic Girvan offers some advice, using DEI as her example

Can artificial intelligence (AI) really add value to diversity, equity and inclusion (DEI) training experiences, when it lacks emotion, empathy and the ability to understand the complexities of human nature? With AI so glaringly inhuman, how can this machine resource contribute to an organisation’s behavioural change and cultural transformation?

The unfortunate reality – despite extensive evidence of its necessity – is that DEI is often not seen as business critical. And, in these times of financial restraint, it’s the first budget to cut, leaving internal DEI teams understaffed and expected to do more with less.

Unfortunately, AI can’t be the singular saviour of internal efforts. But, when used alongside subject matter experts, AI can be a tool that boosts a learning team’s productivity and enhances that wondrous human potential. Don’t get carried away just yet, though. Mindful, ethical and responsible use of AI must be supported by well-established assurance measures – in particular, the three checkpoints of quality, safety and grounding.

Let’s explore these checkpoints in greater detail – with a few benefits and risks from currently available AI applications.

1. Quality – is it sensible, specific and interesting?

AI can write anything you ask it to, yet it lacks the human touch and emotional richness that originate from a human creator. This is why the quality checkpoint encourages you to delve deeper than the fundamentals and enhance your offerings with that personal touch.

It’s storytelling – but not as we know it

Storytelling connects us to our common humanity and research has revealed that, because we’re naturally inclined to make connections through narratives, we are far more receptive to diverse viewpoints and concepts when this information is presented through stories. Whilst AI can write scenarios, these narratives often present the account as dispassionate observation, without any consideration of what is happening ‘behind the mask’. In DEI training, this misses a huge learning opportunity. So use AI to create scenarios, but enhance that content with human lived experience, because it’s the emotion, the internal struggle and the lived reality that turn a tick-box topic into transformative change.

Free up specialists to specialise

By using AI tools to handle time-consuming yet fundamental tasks, such as drafting quizzes and scenarios, learning designers can redirect their efforts toward their core strength – crafting impactful learning experiences that truly matter. With AI-enhanced video and virtual reality (VR) generation platforms, learning designers can now escape lacklustre text-based case studies without having to acquire video-production expertise. Platforms like Vyond and Synthesia mean instructional designers can liven up their exercises by building diverse character interactions that allow learners to practise the required inclusive behaviours, without the fear of getting it wrong or causing harm.

2. Safety – is it free of bias, hate speech, violence and other harms?

If generated content is created from biased data, it risks exaggerating stereotypes and prejudiced beliefs. The safety checkpoint reminds us to carefully validate AI-created content, removing anything that might be harmful or make people feel targeted because of individual characteristics.

Decode data bias

Whilst there are a host of AI analysis tools to identify human biases, when it comes to learning creation, remember that AI systems learn from training data that may contain biased human decisions or historical and social inequities, even if we remove sensitive variables like gender, race or sexual orientation. The truth is that data can link to these characteristics in non-obvious ways (like occupation), and these proxy variables can result in AI replicating discriminatory patterns. For example, if your in-house learning and development team relied solely on AI tools to write its gender equality content, some of that content is likely to be drawn from biased patterns in the underlying data – meaning it may reinforce stereotypes, such as the idea that women are better suited to nurturing or administrative roles, while men are more suited to leadership or technical roles.
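
To make the proxy-variable risk concrete, here is a minimal, hypothetical Python sketch. Everything in it is invented for illustration: the sensitive attribute (gender) has already been removed from the records, yet a naive model trained on occupation alone still reproduces the historical skew, because occupation stands in for the attribute that was stripped out.

```python
# A minimal, hypothetical sketch of the proxy-variable problem described above.
# All names and records are invented: gender has been removed from the data,
# yet occupation still carries its signal, so a naive model trained on
# historical decisions reproduces the biased pattern anyway.
from collections import Counter, defaultdict

# Invented historical records with the sensitive attribute already stripped out.
historical_records = [
    {"occupation": "administrator", "recommended_for_leadership": False},
    {"occupation": "administrator", "recommended_for_leadership": False},
    {"occupation": "administrator", "recommended_for_leadership": True},
    {"occupation": "engineer", "recommended_for_leadership": True},
    {"occupation": "engineer", "recommended_for_leadership": True},
    {"occupation": "engineer", "recommended_for_leadership": False},
]

# "Train" a naive frequency model: P(recommended | occupation).
counts = defaultdict(Counter)
for record in historical_records:
    counts[record["occupation"]][record["recommended_for_leadership"]] += 1

def predicted_probability(occupation: str) -> float:
    """Return the learned probability of a leadership recommendation."""
    tally = counts[occupation]
    return tally[True] / (tally[True] + tally[False])

# The model never saw gender, but if occupation correlated with gender in the
# source data, the historical skew is reproduced in every new recommendation.
for job in ("administrator", "engineer"):
    print(f"{job}: P(recommended for leadership) = {predicted_probability(job):.2f}")
```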

Sense check stereotyping

With the growing popularity of AI image creation tools such as Midjourney, Stable Diffusion and DALL-E, it is important for instructional designers to be aware of the stereotyping on display when it comes to artificial representation. Bloomberg’s Stable Diffusion study generated 300 portrait images for each of 14 different professions – revealing that AI depicted people with lighter skin tones for high-paying jobs, whilst prompts like “fast-food worker” and “social worker” led to portraits with darker skin tones. The gender review showed a similar pattern, where most occupations produced male portraits, but low-paying jobs like housekeeper and cashier featured more women.

3. Grounded – is it supported by authoritative sources?

The grounded checkpoint encourages us to ensure that AI-created content comes from sources we can trust. It also serves as a crucial reminder to tap into the expertise of DEI advocates, marginalised communities and subject matter specialists when authenticating learning content.

Personalise at scale

Personalised learning paths are a significant enhancement brought about by AI. DEI learning needs are personal. There are hot topics and burning questions that we want to know more about but struggle to separate fact from fiction – especially amid an overwhelming expanse of social media and search results. AI tools can compile reading material and pertinent learning resources simply by having learners indicate their areas of interest. The key caveat is that, whether it is the learner or the learning designer using AI to source material, the output must be treated as a starting point. As humans, we have the unique ability to think critically. Therefore, when AI provides material, don’t take the answer at face value – use its sources to validate and enhance the insight. AI can signpost the content – but humans must explore the context. Remember, AI is the tool – but AI in partnership with human intelligence is the true sweet spot when it comes to being grounded.
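
As a rough illustration of that ‘AI signposts, humans explore’ principle, the hypothetical Python sketch below stands in for whatever tool a team actually uses to compile resources from a learner’s stated interests. The function names, fields and sample data are all invented; the point is simply that every AI suggestion should carry a source a human can check, and anything without one is not accepted as-is.

```python
# A hypothetical sketch of treating AI-compiled resources as a starting point.
# suggest_resources() stands in for whatever AI tool a team actually uses; the
# names, fields and sample output below are invented for illustration only.
from dataclasses import dataclass

@dataclass
class Resource:
    title: str
    source_url: str  # where the material actually comes from
    summary: str

def suggest_resources(interests: list[str]) -> list[Resource]:
    """Placeholder for an AI-driven suggestion step (returns invented samples)."""
    return [
        Resource(
            title="Inclusive language guide",
            source_url="https://example.org/inclusive-language",
            summary="Practical phrasing guidance for everyday workplace writing.",
        ),
        Resource(
            title="Unconscious bias explainer",
            source_url="",  # no source returned, so it cannot be validated yet
            summary="Overview of common bias patterns in hiring decisions.",
        ),
    ]

def triage_for_human_review(resources: list[Resource]) -> None:
    """AI signposts the content; a human checks the source before it is used."""
    for item in resources:
        if item.source_url:
            print(f"{item.title}: follow up and validate {item.source_url}")
        else:
            print(f"{item.title}: no source provided, do not use until verified")

triage_for_human_review(suggest_resources(["inclusive language", "bias in hiring"]))
```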

Authenticate representation

Using AI-powered avatars gives learning designers the ability to create content that reflects the diversity of their employees and clients. This representation in learning can make underserved groups more visible and promote a sense of belonging. However, using ‘synthetic’ humans comes with some grounding questions. Is it authentic? Is it a true representation? Is it effective? Because if your design team lacks diversity and a genuine understanding of the avatar’s lived reality, there’s a risk of reinforcing stereotypes and distorting the voices of under-represented groups with the designers’ own biases – whether they are aware of them or not. To make sure your representation is authentic, it’s a good idea to involve a real person who embodies the avatar’s characteristics to review your work – checking the language, voice and overall presentation. This approach gives the avatar a real identity, turning it from a simple puppet into a true representative.

This article has presented the opportunities and risks recognised within the three checkpoints but, in short, to expand the scope of your DEI efforts, it’s crucial to combine human and AI capabilities effectively. Make use of the natural strengths possessed by your team members – particularly their curiosity, creativity and critical thinking – but don’t stop there. The key to unlocking tomorrow’s people potential lies in your ability to problem-solve with AI today.

Nic Girvan is the director of learning and delivery at GP Strategies.
