Facial Recognition Is Rarely Able To Identify Trans And Non-Binary People

Facial recognition has been taking the world by storm, but could it be doing more harm than good?

We rarely think about how extensively facial recognition is woven into our lives. Unlocking our phones, checking temperatures, and AI-driven screening in immigration, security and other everyday operations are just some of the areas where the technology has gained momentum over the past few years. And while most of us may not think twice about a device scanning our face and inferring information about us, the experience is far more fraught for people who do not conform to the gender binary.

Studies show that facial recognition software has gender issues

A recent study from the University of Colorado Boulder showed that major AI-based facial analysis tools frequently misgender people who are not cisgender. The researchers tested four facial analysis services: Amazon’s Rekognition, IBM’s Watson, Microsoft’s Azure, and Clarifai.

The researchers collected 2,450 images from Instagram of people who identified as male, female, genderqueer, trans men, trans women or non-binary. The images were divided into groups based on how the people in them self-identified, and each company’s facial analysis tool was tested against each group.

The results were startling: only cisgender men and women were classified accurately around 98% of the time. Trans men were misclassified about 30% of the time, and the error rates were far higher for trans women and for non-binary or genderqueer people. Why does this happen? Because these systems have been trained to treat gender as binary, male or female, so every face must be labelled and sorted into one of those two categories, no matter how the person actually identifies.
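
To make the failure mode concrete, here is a minimal, purely illustrative sketch in Python (not the study’s actual code or data) of how such an evaluation works: a classifier whose label space contains only “male” and “female” is scored against groups of self-identified labels, so anyone whose identity falls outside those two labels can never be classified correctly.

```python
# Hypothetical sketch: a gender classifier whose label space is only
# {"male", "female"} scored against self-identified groups.
# The classifier, features and numbers are illustrative, not from the study.

from collections import defaultdict

BINARY_LABELS = {"male", "female"}

def toy_classifier(image_features):
    # Stand-in for a commercial facial analysis API: whatever the input,
    # the output is forced into one of two labels.
    return "male" if sum(image_features) > 0.5 else "female"

# Each sample: (self-identified label, made-up feature vector)
samples = [
    ("male", [0.9]), ("female", [0.1]),
    ("trans man", [0.3]), ("trans woman", [0.8]),
    ("non-binary", [0.6]), ("genderqueer", [0.2]),
]

correct = defaultdict(int)
total = defaultdict(int)

for identity, features in samples:
    prediction = toy_classifier(features)
    total[identity] += 1
    # A prediction can only be "correct" when the person's identity happens
    # to be one of the two labels the model knows about; non-binary and
    # genderqueer people can never be classified correctly here.
    if identity in BINARY_LABELS and prediction == identity:
        correct[identity] += 1

for identity in total:
    print(f"{identity}: {correct[identity]}/{total[identity]} classified correctly")
```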

A machine’s idea of gender is binary

“We found that facial analysis services performed consistently worse on transgender individuals, and were universally unable to classify non-binary genders. While there are many different types of people out there, these systems have an extremely limited view of what gender looks like,” said a PhD student at CU Boulder in a statement.

Os Keyes, a researcher who studies the intersection of human-computer interaction and social science, has tried to understand why machines were taught such a binary view of gender by examining roughly 30 years of research. Reviewing 58 research papers on automatic gender recognition, Keyes found that more than 90% of them treated gender as binary, more than 70% treated it as unchanging, and more than 80% treated it purely as an outcome of one’s physiology.

“The consequence has been tremendous underrepresentation of transgender people in the literature, recreating discrimination found in the wider world. Automatic gender recognition (AGR) research fundamentally ignores the existence of transgender people, with dangerous results,” Keyes writes.

Facial recognition software has racial biases too

A study by the MIT Media Lab on Amazon’s Rekognition added to the growing pile of evidence that facial recognition software is also racially biased. The study found that the software misidentified darker-skinned women as men about one-third of the time, and even lighter-skinned women were sometimes mislabelled as men. Overall, the software classified lighter-skinned men far more accurately than darker-skinned women. The stakes are especially high in law enforcement, where such misidentification can leave darker-skinned people accused of crimes they did not commit.

They are trying to ‘update’ their software

In response to these claims, Amazon said on its website that Rekognition “wasn't designed to categorize a person’s gender identity” and that labelling an individual’s identity is not the intent of the tool. The company states that gender prediction is best suited to aggregate cases that don’t involve specific users. “For example, the percentage of female users compared to male users on a social media platform,” Amazon writes.
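
As a rough illustration of that aggregate use case, the hedged sketch below uses the boto3 SDK for Rekognition to estimate a male/female split across a set of profile photos; the bucket name, image keys and AWS credentials are assumed placeholders. It also shows why the output is binary by design: the Gender attribute the API returns can only take the values “Male” or “Female”.

```python
# Hedged sketch of the aggregate use case Amazon describes, using the boto3
# SDK for Rekognition. The bucket name and image keys are placeholders, and
# valid AWS credentials are assumed. Note that the Gender attribute returned
# by the API is binary by design: its Value field is "Male" or "Female".

import boto3

BUCKET = "example-profile-photos"          # hypothetical S3 bucket
IMAGE_KEYS = ["user1.jpg", "user2.jpg"]    # hypothetical image files

rekognition = boto3.client("rekognition")

counts = {"Male": 0, "Female": 0}
for key in IMAGE_KEYS:
    response = rekognition.detect_faces(
        Image={"S3Object": {"Bucket": BUCKET, "Name": key}},
        Attributes=["ALL"],                # include Gender, AgeRange, etc.
    )
    for face in response["FaceDetails"]:
        counts[face["Gender"]["Value"]] += 1

total = sum(counts.values()) or 1
print(f"Female: {100 * counts['Female'] / total:.1f}% of detected faces")
print(f"Male:   {100 * counts['Male'] / total:.1f}% of detected faces")
```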

While this software was invented to make our lives easier, for many people it seems to be doing the opposite. Misrepresenting or underrepresenting a community has serious social consequences: discrimination, exclusion, and a disregard for people’s identity and freedom of expression. These tools were built on outdated understandings of gender, and researchers should keep up with the times and strive to represent everyone across the gender spectrum. After all, isn't that what software is supposed to do? Update to accommodate new information and values, and improve the quality of service and of life.
