
Technology: Who’s being included? Who’s being ignored?


When we examine whether technology is really serving us, we have to interrogate who we include when we think of "us". In the attempt to increase productivity and make our work lives more convenient, we can unconsciously inconvenience others.

Hilary Harvey, Associate Partner at TPC UK, sheds light on the matter: “The fact that we embrace technology because it’s new and it makes our lives convenient means we don’t often stop to ask, ‘Whose life isn’t being made convenient through this? Who’s not being included as a user in terms of the design thinking?’”

“As human beings with conscious and unconscious biases, we structurally build them into algorithms. And particularly now when we are having such a global recognition of gender inequality and Black Lives Matter, tech is a fundamental part of it, and needs to be a part of the conversation.”

Non-neutral algorithms

Tech is not independent of the people who design it. “We, as the ordinary lay person, assume that tech is neutral,” says Hilary. “But discrimination underpins our systems and our tech.”

Maybe we compartmentalise tech in our minds, dissociating it from social issues and relegating it to a category that includes numbers, equations and engineering. But we forget that since tech started dealing with personal data, its impact has become personal too.

“There are so many examples of discriminatory tech in social media.” Hilary points to Twitter, which, in an attempt to moderate comments, created an algorithm that could take “toxic” comments down automatically. But if two people say words to the same effect, a black person’s comments are twice as likely to be taken down as a white person’s. “And that’s just the algorithm.”

And the consequences of discriminatory tech go far beyond social media. “In the UK the algorithm-decided exam results discriminated against a certain demographic of the population,” says Hilary. “And then just recently there was the visa fast-track program, which fast-tracked you if you were white.”

The feedback loop of inequality

“It doesn’t matter how many times organisations say they’re up for equality,” says Hilary. “If it isn’t reflected in the tech, we won’t progress. The inequality will be embedded and we won’t even see it because it is a part of our daily lives.”

The consequences can be significant – affecting our society, our businesses and our boardrooms. This was evident when Carnegie Mellon University uncovered that Google’s ad-targeting system was six times more likely to advertise high-paying jobs to men than to women.

“A common mistake is training an algorithm to make predictions based on past decisions from biased humans,” said a Metis senior data scientist in an interview with Live Science. And this kind of logic has crept into tech used by the criminal justice system.

ProPublica’s analysis of Northpointe’s COMPAS formula revealed that the algorithm was “likely to falsely flag black defendants as future criminals, wrongly labeling them this way at almost twice the rate as white defendants.” In addition, “White defendants were mislabeled as low risk more often than black defendants.”
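The mistake the Metis data scientist describes is easy to reproduce in miniature. The sketch below is purely illustrative and assumes nothing about Northpointe’s actual formula: it trains a simple classifier on synthetic “historical” labels that were skewed against one group, then shows the model wrongly flagging low-risk members of that group more often.

```python
# Illustrative sketch only (synthetic data, not any vendor's real model):
# a classifier trained on biased historical labels reproduces the bias.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 20_000

# Hypothetical population: 'group' (0 or 1) has no effect on true risk...
group = rng.integers(0, 2, n)
true_risk = rng.normal(0, 1, n)

# ...but past human decisions flagged group 1 more often at the same risk.
biased_label = (true_risk + 0.8 * group + rng.normal(0, 1, n)) > 1.0

# Train on group membership plus a noisy risk score, as a real system might.
X = np.column_stack([group, true_risk + rng.normal(0, 0.5, n)])
model = LogisticRegression().fit(X, biased_label)
pred = model.predict(X)

# Compare how often genuinely low-risk people get flagged, by group.
low_risk = true_risk < 0
for g in (0, 1):
    mask = low_risk & (group == g)
    print(f"group {g}: share of low-risk people flagged = {pred[mask].mean():.2f}")
```

Nothing in the code “intends” to discriminate: the skew comes entirely from the historical labels, which is exactly why it is so easy to miss.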

Then, in an attempt to predict crimes before they occurred, PredPol’s machine learning algorithm sent police to locations with a high minority population. And because of the system’s feedback loop, the increased number of arrests in those areas meant the algorithm was even more likely to send police back there again.
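To make that loop concrete, here is another minimal, purely illustrative sketch (it assumes nothing about PredPol’s real model): two districts with identical true crime rates, where patrols follow past arrest counts and recorded arrests follow patrols.

```python
# Illustrative sketch of a predictive-policing feedback loop.
# Two districts behave identically, but district B starts with
# slightly more recorded arrests.
import numpy as np

rng = np.random.default_rng(1)
true_crime_rate = np.array([0.1, 0.1])  # identical underlying behaviour
arrests = np.array([10, 12])            # district B starts slightly higher

for round_ in range(10):
    # The current "hot spot" gets the bulk of the patrols...
    patrols = np.array([30.0, 30.0])
    patrols[np.argmax(arrests)] += 40.0
    # ...and more patrols mean more crimes observed, hence more arrests.
    arrests = arrests + rng.poisson(patrols * true_crime_rate)
    print(f"round {round_}: arrests = {arrests}")
```

Because recorded arrests measure where police looked rather than where crime happened, the initial imbalance locks in and compounds, even though both districts behave identically.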

True investment in the future

In our hurry to speed up processes, such as the UK’s automated passport photo check, which worked poorly for people with darker skin tones, we can overlook underrepresented groups. That is why it is imperative to bring those groups into all levels of tech design and implementation.

“That’s why it’s so important to have inclusion in education,” says Hilary. “And for education programs to sustain inclusion and high levels of diversity throughout the career pipeline.”

If equality isn’t addressed in the tech, we will inevitably support a systemic problem, even if our intentions are apparently neutral. And as AI technology continues to develop and influence more areas of our lives and businesses, that injustice will only grow. As U.S. Rep. Alexandria Ocasio-Cortez says, “If you don’t fix the bias, then you are just automating the bias.”

In the wake of the pandemic, businesses are leaning on technological ecosystems more than ever before. And any issues that were present before are becoming more prevalent. It is now essential for us to examine who our technology is serving.

“Leaders have a responsibility to have a critical mindset,” says Hilary. “And to ask the question, ‘How do we know that tech is neutral?’”
