I owe thanks to very many people for enabling all of this to happen.
That’s the highlights reel. The reality is of course far more mixed, and I want to balance the showcase with some of the challenges that have accompanied it, because life is complex and that needs to be seen too.
Discovering the rollercoaster of running your own company full time, during a downturn in demand for diversity, equity and inclusion projects
Cutting my income by over 50%, plus no more pension or benefits
Multiple family health challenges, many still ongoing
Subsidence causing progressive damage to my home, 18+ months into an insurance claim with no resolution or end in sight
The challenge of maintaining my own wellbeing through what can be emotionally intensive work, plus all the other stuff above
On balance, it’s all been worth it, and I’m looking forward to 2024.
By Rebekah Bostan, Parul Wadhwa, and Jo Stansfield
Have you ever wondered what you can do, as an individual, to foster digital inclusion in your organisation?
Colleagues are not powerless in the adoption of AI by their organisations and are likely to play a vital role in ensuring inclusive adoption and usage. To help you, Making Change in Insurance (MCII) and Jo Stansfield have put together a list of actions individuals can take to support inclusive digital transformation and AI adoption.
AI will transform all types of businesses, and is already transforming insurance by increasing personalisation, reducing operating costs and supporting new product development. But we need to ensure that adoption is as inclusive as possible at all stages: the design, development, deployment and use of the technology, and importantly its overall governance. In particular, given the current speed of digital transformation and AI development, we need to ensure that we do not end up excluding some groups and deskilling others.
“Technological progress has to be designed to support humanity’s progress and be aligned to human values. Among such values, equity and inclusion are the most central to ensure that AI is beneficial for all.”
You may or may not have direct influence in all the areas highlighted, but either way, we think you should know what best practice looks like and, where possible, how you can influence the development of more inclusive digital adoption and implementation.
1) Get your learning hat on – you are not a helpless bystander in the accelerated adoption of AI.
AI, and especially generative AI, provides guidance, co-piloting and even tools to undertake our work more efficiently, but humans are still responsible for how it is used. The June 2023 PwC Workforce Hopes and Fears Survey found that 57% of respondents felt that AI would positively support them in acquiring new skills or improving their productivity. Unfortunately, most organisations are not yet providing sufficient staff training, so it is important for individuals to start engaging with technology, such as generative AI, themselves.
Some great personal upskilling tools we have found include:
ACTION: ✔ Commit to upskilling yourself – make a weekly plan.
2) Recognise automation bias and value the need for a “human-in-the-loop”
Checks and balances are vital to stop humans over-prioritising the results of automated decision-making systems over their own judgement, especially as automation is only likely to increase.
Be aware of the human tendency towards “automation bias”: we place trust in automated systems even when available information demonstrates they may not be trustworthy. Consequently, we may follow the system’s suggestions or actions uncritically and fail to take other sources of information into account. Alternatively, we may fail to notice warning signs from the system because we are not monitoring it with sufficient diligence.
ACTION: ✔ Be aware of the risk of falling for automation bias. In your interactions with AI systems, be diligent in monitoring the system, know its limitations, and seek other sources of information to verify the outputs.
✔ Find out what checks and balances your organisation is putting in place to minimise automation bias – and where you find these lacking, highlight the issue.
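To make the idea of a check-and-balance concrete, here is a minimal sketch in Python of one common pattern: routing automated outputs to a person instead of accepting them uncritically. The confidence threshold, function names and record shapes are illustrative assumptions, not drawn from the article or any specific system.

```python
# Minimal sketch of a "human-in-the-loop" gate: an automated result is
# only accepted without review when the system reports high confidence;
# everything else is escalated to a person. The 0.9 threshold is an
# illustrative assumption, not a recommendation from the article.

REVIEW_THRESHOLD = 0.9

def route_decision(prediction, confidence):
    """Return (route, prediction): auto-accept only above the threshold."""
    if confidence >= REVIEW_THRESHOLD:
        return ("auto_accept", prediction)
    return ("human_review", prediction)

# Example: a low-confidence claim decision goes to a human reviewer.
print(route_decision("approve_claim", 0.95))  # ('auto_accept', 'approve_claim')
print(route_decision("approve_claim", 0.55))  # ('human_review', 'approve_claim')
```

The point of a gate like this is not the threshold value itself, but that someone has deliberately decided which outputs a human must look at, rather than leaving it to chance.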
3) Accept that inclusive digital transformation requires multiple layers of defence (starting with technical operations and oversight, then moving to audit)
Organisations have three main layers of defence to ensure inclusive digital transformation. It is likely that as an individual, you will be directly or indirectly involved in one of these layers of defence. Taking time to understand what your organisation is doing to support these defence layers will help you to understand where there might be gaps or ineffective defences.
Layers of defence
ACTION: ✔ Be willing to ask challenging questions about your organisation’s layers of defence, whatever your level in the organisation. Here are some suggestions:
Technical operations: How are we assessing, on an ongoing basis, that the system is operating as intended, and managing the risk of unintended consequences?
Oversight and accountability: What is our data ethics framework for how to use data appropriately and responsibly? If you are part of an ethics committee, ask: Do we have sufficiently diverse inputs represented across technical and data ethics skillsets, and across our stakeholder community?
Audit: Do we have sufficient transparency and maturity of approach to enable an effective audit?
4) Minimise “garbage in, garbage out”
AI risks reinforcing our own prejudices because its outputs can only be as good as the data it learns from, which often includes inherent human biases. However good the machine learning model, poor quality data inputs will cause it to learn incorrectly and produce poor quality results. “Garbage”, or poor quality data, can include inaccurate data that is a poor representation of its target, incomplete data, inconsistent data, and data that is not valid for the purpose it is being used for.
ACTION: ✔ Support the reduction or even elimination of data biases by being a strong advocate for testing and evaluation of training data, and by ensuring that systems are put in place for continuous monitoring.
✔ Report adverse incidents – don’t assume others will – and if your organisation doesn’t have a reporting system in place, advocate for one.
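To make training-data testing more tangible, here is a minimal sketch in Python of automated checks for two of the “garbage” categories above (incomplete and invalid records) plus a simple representation count to surface under-represented groups. The field names, valid ranges and group labels are illustrative assumptions, not a real insurance dataset or any specific tool.

```python
# Minimal sketch of automated data-quality checks on training records.
# All field names and thresholds below are hypothetical examples.

def check_quality(records, required_fields, valid_ranges):
    """Count records that are incomplete or contain out-of-range values."""
    findings = {"incomplete": 0, "invalid": 0}
    for row in records:
        if any(row.get(f) is None for f in required_fields):
            findings["incomplete"] += 1
        for field, (lo, hi) in valid_ranges.items():
            value = row.get(field)
            if value is not None and not (lo <= value <= hi):
                findings["invalid"] += 1
    return findings

def representation(records, group_field):
    """Count records per group, to surface under-represented groups."""
    counts = {}
    for row in records:
        group = row.get(group_field, "unknown")
        counts[group] = counts.get(group, 0) + 1
    return counts

# Example with a hypothetical three-record dataset.
records = [
    {"age": 34, "postcode": "CB1", "group": "A"},
    {"age": 210, "postcode": "CB2", "group": "A"},   # invalid age
    {"age": None, "postcode": "CB3", "group": "B"},  # incomplete
]
print(check_quality(records, ["age", "postcode"], {"age": (0, 120)}))
# {'incomplete': 1, 'invalid': 1}
print(representation(records, "group"))
# {'A': 2, 'B': 1}
```

Checks like these are deliberately simple; the value comes from running them continuously and treating a non-zero finding as something to investigate and report, not ignore.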
5) Understand why protecting your and other people’s data is important
We live in a world where websites and other services continuously request access to our data, and most of the time we provide it unthinkingly. Our data has become an alternative form of payment: the price of access to a service.
But we don’t necessarily need to click “yes” to get that access, and there are good reasons to keep our personal information private. Doing so helps protect against identity theft, keeps your financial information safe and protects you from possible discrimination.
ACTION: ✔ Read your organisation’s Privacy Policy and understand how you can report misuse.
✔ Learn about your rights and responsibilities when it comes to personal data – you can find a great resource here.
6) Support successful adoption and usage of AI by fostering cross-organisational, intergenerational and intersectional dialogues
Intergenerational and intersectional dialogues on adoption and usage are important to enable effective and inclusive adoption. Activating cross-organisational information and skill exchanges can reduce uncertainty and bias, ensuring a more successful adoption of AI. This can include reverse mentoring, where younger employees teach senior staff about new technologies and use cases. Intersectional workshops are another important way to bring together employees from different age groups and communities to solve business challenges, including inclusive AI adoption.
ACTION: ✔ Look for opportunities to enrich your understanding of the impact of AI usage and adoption by calling for intergenerational and cross-organisational dialogues within your organisation.
7) Increased tech-focused hiring is likely to exacerbate existing diversity gaps and even create an “age gap” – counter this by advocating for colleague upskilling
Whilst organisations are likely to substantially increase their tech-focused hiring over the next few years, the scale of need means that existing staff will also need to be upskilled or reskilled. McKinsey notes that hiring a new employee can cost up to 100% of their annual salary, while upskilling or reskilling costs under 10%. A focus on upskilling existing employees is also likely to support the closure of diversity gaps, as pure tech hires still tend to be less diverse than the general population. Reskilling can also prevent an “age gap” appearing, where older workers are effectively deskilled and prematurely exit organisations.
ACTION: ✔ Advocate for more AI upskilling and reskilling opportunities across all departments and levels so no one is left behind.
MAKING CHANGE IN INSURANCE
We are an inclusive and supportive group of insurance and technology-focused change makers who have regular challenging conversations about diversity, equity and inclusion. We then share our insights with the wider insurance and tech community. We would love you to join us at our next event – please contact Rebekah Bostan, Caroline Langridge, Areefih Ghaith or Parul Wadhwa for further information.
The above action plan was influenced by an MCII discussion in October 2023 where we explored actions individuals can take to foster inclusive AI adoption and digital transformation within organisations.
I’m still pinching myself to have had the opportunity and privilege to join a government AI roundtable yesterday, focusing on the proposals in the AI Regulation White Paper. It was great to join a diverse group of leaders – as one of us said, each bringing a different piece of the jigsaw puzzle, but with great alignment across us.
My perspective
✔️ Although AI is new and rapidly developing, powerful global organisations, intense commercial pressure and a fragmented regulatory landscape are not. Think of the Deepwater Horizon disaster in 2010. Lessons were learned then, resulting in regulatory reform and industry-wide transformation to establish a culture of safety.
✔️ Our risks from AI are not in futuristic AI-domination scenarios; they are already present in the disparate impacts and harms to underserved and vulnerable communities that we see happening now.
āļøLet’s learn lessons now from other industries, to avert AI’s Deepwater Horizon event. We must move quickly to strengthen and clarify regulation and establish a culture of responsible innovation.
✔️ In the interim, incentives are needed for organisations to prioritise responsible, inclusive innovation. In Inclusioneering’s equity, diversity and inclusion practice, we see UKRI funding that embeds EDI requirements acting as an effective catalyst for organisations, which are responding positively and innovatively to embed EDI into their work.
✔️ Diverse inputs are needed to fully understand the risks and impacts, as well as the opportunities, across all stakeholder communities. I recommend an inclusive design approach to the development of regulation.
I will be spending time writing papers on these thoughts, so there is more to come on all these topics. Drop me a message using the contact form or in the comments if you’d like to receive a copy. Which are you most interested to explore more deeply?
In the conclusion of this two-part interview, Inclusioneering founder Jo Stansfield joins Bella Ikpasaja in a wide-ranging discussion of all things inclusion and diversity in tech.
Listen in to learn:
Tools that Agile can offer to organisations to develop inclusive cultures
For the creative industries, how tech can inspire leaders and hiring teams
Hopes for the future of AI
What we’re seeing “on the ground” regarding digital skills
Looking ahead 10 years, how we see the social impact of tech evolving
If you prefer to read, you can find the full text for this part of the interview on Bella’s Medium blog.
And don’t forget to check out Part 1 if you missed it!
Inclusioneering founder Jo Stansfield joins Bella Ikpasaja in a wide-ranging discussion of all things inclusion and diversity in tech, as the inaugural speaker in her series of interviews with writers, organisational psychologists and diversity and inclusion practitioners.
Listen to Part 1 to learn:
What inspired the creation of Inclusioneering
How Jo’s experiences returning to work after becoming a parent ignited her passion for DEI
How diversity impacts business outcomes
Effective ways to boost STEM skills for underserved communities
Shifts in tech and DEI that could play out over the next 5 years
If you prefer, you can read the full interview on Bella’s Medium blog.
Jo Stansfield shares insights gleaned from a Diversity and Inclusion Roundtable event bringing together recruitment and HR professionals from across Cambridge and the surrounding region.