inclusion Archives - Inclusioneering

2023 Year in Review

Posted in Uncategorized

Wow, what a year!

😶 I quit my job as Director of People Data and Insights at AVEVA
😅 … to run Inclusioneering Limited full time
✅ I joined the trustee board of BCS, The Chartered Institute for IT
✅ … and the board of directors of ForHumanity, an AI ethics and audit charity
😍 I had the pleasure to work with amazing clients who continually inspire me:
TransFIRe Hub, Materials Made Smarter Centre, Lucideon, VBuddies – Metaverse Tech Ltd, Scott Bradbury, EIT Food
🤝 … on projects spanning strategy development, surveys, research and data analysis, leading the Foundation Industry EDI working group, learning and development, and digital privacy, safety and equality
🏫 and with top universities including Durham, Leeds, Nottingham, Cambridge, and Sheffield
🎤 30 public speaking engagements
👨‍👩‍👧‍👧 Expanded my network of outstanding associates who co-deliver Inclusioneering’s work
📃 Inclusioneering Limited was shortlisted for the StartUp Awards Social Enterprise of the Year
🏅 And winner of SME News Corporate Inclusion Consultancy of the Year
🤩 I met one of my heroes, Dame Stephanie Shirley, for afternoon tea at the Savoy (still in shock this happened! 👀)
💎 And continued to work with another, Professor Sue Black OBE ❤️
👏 Judge for the ComputerWeekly.com Most Influential Woman in UK Tech award
⭐ 4 visits to the Houses of Parliament, on tech, diversity and Ethnicity Pay Gap topics, and joined 1 government roundtable on AI

I owe thanks to very many people for enabling all of this to happen.

That’s the highlights reel. The reality is of course far more mixed, and I want to balance the showcase with some of the challenges that have accompanied it, because life is complex and that needs to be seen too.
⬛ Discovering the rollercoaster of running your own company full time, during a downturn in demand for diversity, equity and inclusion projects
⬛ Cutting my income by over 50%, plus no more pension or benefits
⬛ Multiple family health challenges, many still ongoing
⬛ Subsidence causing progressive damage to my home, 18+ months into an insurance claim with no resolution or end in sight
⬛ The challenge of maintaining my own wellbeing through what can be emotionally intensive work, plus all the other stuff above

On balance, it’s all been worth it, and I’m looking forward to 2024.

An AI-generated image showing two horses made of cogs, each representing artificial intelligence bolting away from us.

Fostering digital inclusion: Your actions make a difference.

Posted in D&I Innovation

By Rebekah Bostan, Parul Wadhwa, and Jo Stansfield

Have you ever wondered what you can do, as an individual, to foster digital inclusion in your organisation? 

An AI-generated image showing four horses made of cogs, each representing artificial intelligence bolting away from us.

Colleagues are not powerless in the adoption of AI by their organisations and are likely to play a vital role in ensuring inclusive adoption and usage. To help you, Making Change in Insurance (MCII) and Jo Stansfield have put together a list of actions individuals can take to support inclusive digital transformation and AI adoption.

AI will transform all types of businesses, and it is already transforming insurance by increasing personalisation, reducing operating costs and supporting new product development. But we need to ensure that its adoption is as inclusive as possible at all stages: the design, development, deployment and use of the technology and, importantly, its overall governance. In particular, given the current speed of digital transformation and AI development, we need to ensure that we do not end up excluding some groups and deskilling others.

“Technological progress has to be designed to support humanity’s progress and be aligned to human values. Among such values, equity and inclusion are the most central to ensure that AI is beneficial for all.”

Francesca Rossi, AI Ethics Global Leader, IBM

You may or may not have direct influence in all the areas highlighted, but either way, we think you should know what best practice looks like and, where possible, be able to influence the development of more inclusive digital adoption and implementation.

1) 🚨 Get your learning hat on – you are not a helpless bystander in the accelerated adoption of AI

AI, and especially generative AI, can act as a guide or co-pilot and provide tools to undertake our work more efficiently, but humans remain responsible for how it is used. The June 2023 PwC Workforce Hopes and Fears Survey found that 57% of respondents felt that AI would positively support them in acquiring new skills or improving their productivity. Unfortunately, most organisations are not yet providing sufficient staff training, so it is important for individuals to start engaging with technology such as generative AI themselves.

Some great personal upskilling tools we have found include:

ACTION
⚡ Commit to upskilling yourself – make a weekly plan

2) 🎯 Recognise automation bias and value the need for a ‘human-in-the-loop’

Checks and balances are vital to stop humans from prioritising the results of automated decision-making systems over their own judgement, especially as automation is only likely to increase.

Be aware of the human tendency towards “automation bias”: we place trust in automated systems even when available information shows they may not be trustworthy. Consequently, we may follow the system’s suggestions or actions uncritically and fail to take other sources of information into account. Alternatively, we may miss warning signs from the system because we are not monitoring it with sufficient diligence.

ACTION
⚡ Be aware of the risk of falling for automation bias. In your interactions with AI systems, be diligent in monitoring the system, know its limitations, and seek other sources of information to verify the outputs.

⚡ Find out what checks and balances your organisation is putting in place to minimise automation bias – and where you find them lacking, highlight the issue (a minimal code sketch of one such check follows below).
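
To make the ‘human-in-the-loop’ idea concrete, here is a minimal illustrative sketch, not taken from MCII or any particular product, of one simple check: automated decisions below a confidence threshold are routed to a human reviewer rather than accepted uncritically. The names (Decision, ReviewQueue, handle) and the 0.8 threshold are hypothetical placeholders.

```python
# Hypothetical sketch: route low-confidence automated decisions to a human.
from dataclasses import dataclass, field

@dataclass
class Decision:
    applicant_id: str
    approved: bool
    confidence: float  # the model's self-reported confidence, 0.0 to 1.0

@dataclass
class ReviewQueue:
    items: list = field(default_factory=list)

    def add(self, decision: Decision, reason: str) -> None:
        # Record the case and why a human needs to look at it.
        self.items.append((decision, reason))

CONFIDENCE_THRESHOLD = 0.8  # illustrative value, not a recommendation

def handle(decision: Decision, queue: ReviewQueue) -> str:
    """Accept high-confidence automated decisions; escalate the rest to a human."""
    if decision.confidence < CONFIDENCE_THRESHOLD:
        queue.add(decision, "low model confidence")
        return "escalated to human reviewer"
    # Even auto-handled outcomes should be sampled and audited periodically.
    return "auto-approved" if decision.approved else "auto-declined"

if __name__ == "__main__":
    queue = ReviewQueue()
    print(handle(Decision("A-001", approved=True, confidence=0.95), queue))
    print(handle(Decision("A-002", approved=False, confidence=0.55), queue))
    print(f"{len(queue.items)} case(s) awaiting human review")
```

The design choice to record a reason alongside each escalated case reflects the point above: a human reviewer needs context, not just a flag, if they are to avoid rubber-stamping the system’s output.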

3) 🌈 Accept that inclusive digital transformation requires multiple layers of defence (starting with technical operations and oversight, then moving to audit)

Organisations have three main layers of defence to ensure inclusive digital transformation. It is likely that, as an individual, you will be directly or indirectly involved in one of these layers of defence. Taking time to understand what your organisation is doing to support these defence layers will help you to understand where there might be gaps or ineffective defences.

Layers of defence
ACTION:
⚡ Be willing to ask challenging questions about your organisation’s layers of defence, whatever your level in the organisation. Here are some suggestions:

Technical operations: 
How are we assessing that the system is operating as intended on an on-going basis, and managing the risk of unintended consequences?

Oversight and accountability: 
What is our data ethics framework for how to use data appropriately and responsibly?
If you are part of an ethics committee, ask: Do we have sufficiently diverse inputs represented across technical and data ethics skillsets, and across our stakeholder community?

Here are some great resources to learn more about data ethics:
McKinsey article on Data ethics 
ForHumanity article on the Rise of ethics committees 

Audit:
Do we have sufficient transparency and maturity of approach to enable an effective audit?

4) 📲 Minimise garbage in – garbage out

AI has the risk of reinforcing our own prejudices because its outputs can only be as good as the data they learn from, which often includes inherent human biases. However good the machine learning model, with poor quality data inputs, the model will learn incorrectly and produce poor quality results. “Garbage”, or poor quality data, can include inaccurate data that is a poor representation of its target, incomplete data, inconsistent data, and data that is not valid for the purpose it is being used for.

ACTION
⚡ Support the reduction or even elimination of data biases by being a strong advocate for data testing and evaluation of training data, and by ensuring that systems are put in place for continuous monitoring (see the illustrative sketch after this list).

⚡ Report adverse incidents, don’t assume others will, and if your organisation doesn’t have a reporting system in place, advocate for one.
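
To illustrate the kind of data testing and continuous monitoring the first action points to, here is a minimal, hypothetical sketch of simple training-data quality checks covering completeness, validity and representation across groups. The column names ("age", "gender"), thresholds and pandas-based approach are assumptions for the example, not a prescribed method.

```python
# Hypothetical sketch: basic quality checks on tabular training data.
import pandas as pd

def check_training_data(df: pd.DataFrame) -> list:
    """Return a list of human-readable data-quality warnings."""
    issues = []

    # Completeness: flag columns with a high share of missing values.
    for col, share in df.isna().mean().items():
        if share > 0.05:
            issues.append(f"{col}: {share:.0%} missing values")

    # Validity: ages outside a plausible range suggest entry or parsing errors.
    if "age" in df.columns:
        out_of_range = df[(df["age"] < 0) | (df["age"] > 120)]
        if not out_of_range.empty:
            issues.append(f"age: {len(out_of_range)} out-of-range value(s)")

    # Representation: very small groups are easily mis-learned by a model.
    if "gender" in df.columns:
        for group, share in df["gender"].value_counts(normalize=True).items():
            if share < 0.10:
                issues.append(f"gender={group}: only {share:.0%} of records")

    return issues

if __name__ == "__main__":
    sample = pd.DataFrame({
        "age": [34, 29, 151, None, 42, 38],
        "gender": ["F", "M", "M", "M", "M", "M"],
    })
    for issue in check_training_data(sample):
        print("WARNING:", issue)
```

Checks like these are a starting point, not a guarantee of fairness; they need to run continuously as new data arrives, alongside the incident reporting described in the second action.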

5) 🛡️ Understand why protecting your own and other people’s data is important

We live in a world where websites and other services are continuously requesting access to our data, and most of the time we provide it unthinkingly. Our data has become like an alternative form of payment, a cost to get access to some service.

But we don’t necessarily need to click “yes” to get that access, and there are good reasons why we should keep our personal information private. Keeping your information private helps to protect against identity theft, keeps your financial information safe and protects you from possible discrimination.

This also matters when we are entrusted to work with other people’s data – and this includes in the context of AI systems that process data about people. In the UK, under GDPR, people have a set of rights, including being informed about the collection and use of their data, the right to object, and the right not to be subject to a decision based solely on automated processing.

ACTION
⚡ Read your organisation's Privacy Policy and understand how you can report misuse

⚡ Learn about your rights and responsibilities when it comes to personal data – you can find a great resource here.

6) 💡 Support successful adoption and usage of AI by fostering cross-organisational, intergenerational and intersectional dialogues

Intergenerational and intersectional dialogues on adoption and usage are important to enable effective and inclusive adoption. Activating cross-organisational information and skill exchanges can reduce uncertainty and bias to ensure a more successful adoption of AI. This can include Reverse Mentoring, where younger employees can teach senior staff about new technologies and use cases. Intersectional workshops are another important way to bring together employees from different age groups and communities to solve business challenges, including inclusive AI adoption. 

ACTION: 
⚡ Look for opportunities to enrich your understanding of the impact of AI usage and adoption by calling for intergenerational and cross-organisational dialogues within your organisation.

7) 🛠️ Increased tech-focused hiring is likely to exacerbate existing diversity gaps and even create an ‘age gap’ – counter this by advocating for colleague upskilling

Whilst organisations are likely to substantially increase their tech-focused hiring over the next few years, the scale of need means that existing staff will also need to be upskilled/reskilled. McKinsey notes that hiring an employee can cost up to 100% of their annual salary but upskilling/reskilling costs under 10%. A focus on upskilling existing employees is also likely to support the closure of diversity gaps, as pure tech hires tend to still be less diverse than the general population. Reskilling can also prevent an ‘age gap’ appearing where older workers are effectively deskilled and prematurely exit organisations.

ACTION: 
⚡ Advocate for more AI upskilling and reskilling opportunities across all departments and levels so no one is left behind.

MAKING CHANGE IN INSURANCE

We are an inclusive and supportive group of insurance and technology-focused change makers who have regular challenging conversations about diversity, equity and inclusion. We then share our insights with the wider insurance and tech community. We would love you to join us at our next event – please contact Rebekah Bostan, Caroline Langridge, Areefih Ghaith or Parul Wadhwa for further information.

The above action plan was influenced by an MCII discussion in October 2023 where we explored actions individuals can take to foster inclusive AI adoption and digital transformation within organisations.

Big Ben

Influencing AI Regulations: We must act now to mitigate risk and ensure AI is used to equally benefit all across society

Posted in D&I Innovation, Social mission

I’m still pinching myself to have had the opportunity and privilege to join a government AI roundtable yesterday, focussing on the proposals in the AI Regulation White Paper. It was great to join a diverse group of leaders – as one of us said, each bringing a different piece of the jigsaw puzzle, but with great alignment across us.

Jo Stansfield, Director of Inclusioneering, with leaders from across the UK AI landscape, assembled for the roundtable event

My perspective

⭐️ Although AI is new and rapidly developing, situations of powerful global organisations, intense commercial pressure and a fragmented regulatory landscape are not. Think of the Deepwater Horizon disaster in 2010. Lessons were learned then, resulting in regulatory reform and industry-wide transformation to establish a culture of safety.

⭐️ The risks from AI are not futuristic AI-domination scenarios; they are already present in the disparate impacts and harms to underserved and vulnerable communities that we see happening now.

⭐️ Let's learn lessons now from other industries, to avert AI's Deepwater Horizon event. We must move quickly to strengthen and clarify regulation and establish a culture of responsible innovation.

⭐️ In the interim, incentives are needed for organisations to prioritise responsible, inclusive innovation. In Inclusioneering's equity, diversity and inclusion practice, we see UKRI funding that embeds EDI requirements acting as an effective catalyst for organisations, which are responding positively and innovatively to embed EDI into their work.

⭐️ Diverse inputs are needed to fully understand the risks and impacts, as well as opportunities – across all stakeholder communities. I recommend an inclusive design approach to the development of regulation.

I will be spending time writing papers on these thoughts, so there is more to come on all these topics. Drop me a message using the contact form or in the comments if you'd like to receive a copy. Which are you most interested in exploring more deeply?

Q&A: All things Inclusion in Tech Part 2

Posted in Social mission

In the conclusion of this two-part interview, Inclusioneering founder Jo Stansfield joins Bella Ikpasaja in a wide-ranging discussion of all things inclusion and diversity in tech.

Listen in to learn:

  • Tools that Agile can offer to organisations to develop inclusive cultures
  • For the creative industries, how tech can inspire leaders and hiring teams
  • Hopes for the future of AI
  • What we’re seeing “on the ground” regarding digital skills
  • Looking ahead 10 years, how we see the social impact of tech evolving

If you prefer to read, you can find the full text for this part of the interview on Bella's Medium blog.

And don’t forget to check out Part 1 if you missed it!

Q&A: All things Inclusion in Tech Part 1

Posted in Social mission

Inclusioneering founder Jo Stansfield joins Bella Ikpasaja in a wide-ranging discussion of all things inclusion and diversity in tech, as the inaugural speaker in her series of interviews with writers, organisational psychologists, and diversity and inclusion practitioners.

Listen to Part 1 to learn:

  • What inspired the creation of Inclusioneering
  • How Jo’s experiences returning to work after becoming a parent ignited her passion for DEI
  • How diversity impacts business outcomes
  • Effective ways to boost STEM skills for underserved communities
  • Shifts in tech and DEI that could play out over the next 5 years

If you prefer, you can read the full interview on Bella's Medium blog.

And don’t forget to tune in to listen to Part 2.