

Diversity data is just like working with other types of data – and it is also very different.  Here’s how.

By D&I Innovation

As a former techie, I have always loved data. Data analysis is like exploration.  It’s a journey of discovery as something previously concealed, but often hiding in plain sight, is revealed.

People data is no different in this regard.

I started working with people data and analysis almost 10 years ago.  I was an employee in a technical role at a software company, and with the support of our CTO and head of HR I volunteered to analyse our people data through the lens of gender.

Finding significant results that we could act on was exciting. I did find those results, and we did make positive changes. But it was also shocking: I could place myself within the findings, with everything that implied for my own future path.  That brought a whole host of feelings.

There is huge responsibility in working with people data. Responsibility for data protection: making sure privacy is protected and data is used only for its intended purpose, by people authorised to do so. Responsibility to communicate clearly what data is collected, and why. And responsibility to consider carefully how we communicate results, ensuring there is a clear plan of action to address any disparity.

We must always remember that data alone never tells the whole story. Beneath those data points are people’s lives and experiences.

When I share findings from data, what I hear back are people’s stories.

Data is a powerful tool to highlight those stories that need to be heard. Used well, data gives context.  It sheds light on what’s systemic, shows what’s important, and directs where to take the most impactful action.

Algorithms can be discriminatory even when they aren’t automated

By D&I Innovation

👀Here’s how it happens, what to be aware of, and how we need to manage all types of algorithmic decision making.

Early in my career, I was settling into a senior software engineer role at a new organisation when they put our entire team at risk of redundancy. HR shared that they would be using a decision-making tool that assessed the way we’d used leave – favouring people who took it in long chunks rather than small pieces. This was meant to correspond to productivity, somehow.

I suffer from migraines, and can lose 2-3 days per month to illness – usually as single days off. I was terrified of what that meant for my chances.

But this wasn’t just me. There are myriad other reasons this algorithm could discriminate, impacting people who

💊Need to manage their health considering long term illness or disability

🤗 Care for children, parents and other dependents

🙏 Celebrate religious festivals that aren’t observed by the dominant culture
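A rule like the one described above is easy to sketch, and the sketch makes the bias plain. This is a hypothetical illustration, not the actual tool – the `leave_score` function and its scoring rule are my assumptions. Two employees take exactly the same total leave, yet the chunk-based rule scores them very differently:

```python
# Hypothetical sketch of a leave-assessment rule like the one described above.
# Assumption: the tool scores people by the average length of each leave block,
# so the same total leave scores higher when taken in long chunks.

def leave_score(leave_blocks):
    """Score an employee's leave by average block length (assumed rule).

    leave_blocks: list of block lengths in days,
    e.g. [6] is one six-day holiday; [1, 1, 1] is three single days.
    """
    if not leave_blocks:
        return 0.0
    return sum(leave_blocks) / len(leave_blocks)

# Two employees with an identical total of 6 days' leave:
one_long_holiday = [6]                   # a single six-day block
single_migraine_days = [1, 1, 1, 1, 1, 1]  # six separate single days

print(leave_score(one_long_holiday))     # 6.0 – favoured by the rule
print(leave_score(single_migraine_days)) # 1.0 – penalised by the rule
```

The rule never mentions health, caring responsibilities, or religion – yet it systematically penalises anyone whose circumstances produce frequent short absences. That is how a "neutral" algorithm discriminates.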

I give credit to my then employer for their transparency in the process, and swift rectification when alerted to the issue.

❓But what happens when we automate these processes?

⚠️ Automation speeds these processes up and scales them out – decisions happen faster, and affect far more people.

⚠️ And we may lose transparency, cutting out the human in the loop who oversees them.

It’s important we carefully examine processes with algorithmic decision making – any that follow a strict set of rules – for embedded bias, automated or otherwise.

PLUS Always be sure to
✅ Be transparent
✅ Keep oversight and assign accountability
✅ Make sure decisions can be explained
✅ Have a process to raise concerns

Especially post-Covid, with greater flexibility in our work patterns – are you certain the flexibility that’s enabled and encouraged in one place isn’t punished by some other process or system?


Influencing AI Regulations: We must act now to mitigate risk and ensure AI is used to equally benefit all across society

By D&I Innovation, Social mission

I’m still pinching myself to have had the opportunity and privilege to join a government AI roundtable yesterday, focussing on the proposals in the AI Regulation White Paper. It was great to join a diverse group of leaders – as one of us said, each bringing a different piece of the jigsaw puzzle, but with great alignment across us.

Jo Stansfield, Director of Inclusioneering, with leaders from across the UK AI landscape, assembled for the roundtable event

My perspective

⭐️Although AI is new and rapidly developing, the combination of powerful global organisations, intense commercial pressure, and a fragmented regulatory landscape is not. Think of the Deepwater Horizon disaster in 2010. Lessons were learned then, resulting in regulatory reform and industry-wide transformation to establish a culture of safety.

⭐️Our risks from AI are not in futuristic domination scenarios, but already present in the disparate impacts and harms to underserved and vulnerable communities we see happening now.

⭐️Let’s learn lessons now from other industries, to avert AI’s Deepwater Horizon event. We must move quickly to strengthen and clarify regulation and establish a culture of responsible innovation.

⭐️In the interim, incentives are needed for organisations to prioritise responsible, inclusive innovation. In Inclusioneering’s equity, diversity and inclusion practice, we see UKRI funding that embeds EDI requirements acting as an effective catalyst for organisations, which are responding positively, and innovatively, to embed EDI into their work.

⭐️Diverse inputs are needed to fully understand the risks and impacts, as well as the opportunities – across all stakeholder communities. I recommend an inclusive design approach to the development of regulation.

I will be writing papers on these thoughts, so more to come on all these topics. Drop me a message using the contact form or in the comments if you’d like to receive a copy. Which are you most interested in exploring more deeply?

WATCH NOW: Investigating the Agile Inclusion Paradox

By D&I Innovation

We know women and people from racial minorities are underrepresented in the technology industry.  Even amongst those who join, progression and retention don’t match up to those of their white male colleagues.

Yet Agile methodologies appear to match many best practices for building inclusion. So why, then, does tech have such a pipeline problem?

In this webinar, recorded at the Women Who Code Connect Reimage Conference, Jo shares her research on diversity in tech teams, and practical steps we can all take to build greater inclusion and equality.

Watch now to:

  • Gain new insights from research about gender and race in tech teams
  • For individuals: Learn how you can gain more traction in your career
  • For allies, teams and organisations: Learn how to support progression and retention of people from underserved groups in your tech teams