Impact of Bias in Data Analysis: A Take on Gender Bias
Posted in 2020
In today's data-driven world, gender bias has emerged as a crucial source of bias in data analysis, which in turn affects algorithmic decision-making and human-computer interaction. This essay discusses the impact of gender bias on data analysis and decision-making processes.
Women make up 49.58% of the world's population, roughly half, yet fewer of them receive an education. Throughout human history, women's views, needs and requirements have been ignored in research, surveys and product design, leaving data and statistics skewed towards men. The result is what Caroline Criado Perez, in her book Invisible Women, calls a society that treats "men as the human default". Perez emphasizes that these silences, these gaps, have consequences: they impact women's lives every day. Research and surveys conducted this way are biased towards men, and their statistics and results miss the needs of half the world's population. When your data is corrupted by silences, the truths you get are half-truths, and often, for women, they aren't even true [1].

This male-default data shapes everyday objects and surroundings. If there is no woman on a product team whose feedback and needs can be examined, how can the product work for women? Perez catalogues this gap: doors too heavy for the average woman to open with ease, shelves set too high, glass stairs and floors through which anyone below can see up a skirt, paving in which heels get stuck easily.

The voice recognition used in smartphones is male-biased too. In 2016, Rachael Tatman, a research fellow in linguistics at the University of Washington, found that Google's speech recognition software, then the best available, was 70% more likely to accurately recognise male speech than female speech [1]. Perez saw this first-hand when her mother repeatedly failed to get voice recognition to work, and succeeded for the first time only when she lowered her pitch [1]. The same goes for facial recognition software, which is widely used in daily life for phone face-unlock, security systems and law enforcement, yet studies have shown it is less accurate at recognizing the faces of women and people of colour.
This is because these systems were largely trained on data sets dominated by white men, which means they may not work as well for other groups. The field functions with the white male as its default image.
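The mechanism is easy to demonstrate with a toy simulation. The sketch below is illustrative only: the pitch distributions, the 90/10 training split, and the "recognizer" (which simply learns one typical pitch and accepts voices near it) are all my own simplifying assumptions, not how any real speech system works. Still, it shows how a model fitted to a skewed sample serves the majority group far better than the minority one:

```python
import random
import statistics

random.seed(0)

# Hypothetical pitch distributions in Hz; the means and spreads are
# illustrative assumptions, not real measurements.
def sample_pitches(mean, sd, n):
    return [random.gauss(mean, sd) for _ in range(n)]

# Training set skewed 90/10 toward male voices, mirroring the
# imbalance the essay describes.
train = sample_pitches(120, 20, 900) + sample_pitches(210, 25, 100)

# Toy "recognizer": learn one typical pitch, accept voices near it.
learned_pitch = statistics.mean(train)
TOLERANCE = 45  # Hz; an arbitrary acceptance band for this sketch

def recognized(pitch):
    return abs(pitch - learned_pitch) <= TOLERANCE

def accuracy(pitches):
    return sum(recognized(p) for p in pitches) / len(pitches)

male_test = sample_pitches(120, 20, 1000)
female_test = sample_pitches(210, 25, 1000)

print(f"learned 'typical' pitch: {learned_pitch:.0f} Hz")
print(f"male recognition rate:   {accuracy(male_test):.0%}")
print(f"female recognition rate: {accuracy(female_test):.0%}")
```

Because 90% of the training voices are low-pitched, the learned "typical" pitch lands near the male average, so male test voices fall inside the acceptance band far more often than female ones. It also explains the anecdote above: a woman who lowers her pitch moves into the band the model learned.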
Women's absence from research leads to their absence from the data it collects. Healthcare algorithms are increasingly used to make decisions about patient care, such as predicting who is most at risk for certain diseases. Many such algorithms, and many medical devices, are designed with male patients in mind, which can lead to misdiagnosis or incorrect treatment for women. For example, heart attack symptoms can differ between women and men, but diagnostic tests are often based on male data, so women can be misdiagnosed or undertreated for heart disease. As journalist Marguerite Del Giudice notes, women have been misdiagnosed because their body type wasn't considered in medical research; she reports that countless women with heart disease have been misdiagnosed in emergency rooms and sent home because it was not known that women can exhibit different cardiovascular symptoms from men [2]. Perez likewise observes that women's heart attacks went undiagnosed because their symptoms were deemed 'atypical' [1]. Invisible Women adds that the earliest research into cardiovascular disease was conducted on men, and women continue to be underrepresented, making up only 25% of participants across thirty-one landmark trials for congestive heart failure between 1987 and 2012 [1]. Because so little research was conducted on female bodies, medicine understood them less well; women's health suffered, sometimes fatally.
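The 25% figure has a direct statistical cost, even within trials that do include women: the standard error of an estimate shrinks with the square root of the sample size, so conclusions about the smaller subgroup are inherently noisier. A minimal sketch, assuming a hypothetical trial of 2,000 participants (the trial size is my invention; the 25% proportion is the one Invisible Women reports):

```python
import math

# Hypothetical trial with the 25% female share reported for the
# landmark heart-failure trials; N = 2000 is an illustrative size.
N = 2000
n_women = int(0.25 * N)   # 500 female participants
n_men = N - n_women       # 1500 male participants

# The standard error of a subgroup mean scales as 1/sqrt(n), so the
# female-subgroup estimate is noisier than the male one by this ratio.
se_ratio = math.sqrt(n_men / n_women)

print(f"women: n={n_women}, men: n={n_men}")
print(f"female-subgroup standard error is {se_ratio:.2f}x the male one")
```

With a 75/25 split the ratio is sqrt(3), about 1.73: findings about women carry roughly 73% more statistical noise than findings about men from the very same trial, before any question of different symptoms even arises.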
The ruckus on social media a few weeks ago, when a breakthrough in male birth control was announced, shows how little this bias is understood. Men were shocked by the number of side effects the pill could cause, yet women have been taking a pill with comparable side effects for years, and it is barely talked about. (The pill isn't used only to limit population growth; it has other uses too, and whether we want population to be regulated at all is another debate.)
Women are bad drivers. I didn't say this; it is a common phrase among men. Now, as self-driving cars become more common, there are concerns that they may be less safe for women, who are more likely to be involved in accidents caused by male drivers. Many of these cars aren't completely driver-free: they have autopilot modes, and the driver is expected to stay seated and take some decisions, albeit with less attention. Most autonomous vehicles are trained on data sets dominated by male drivers, which means they may not be as good at recognizing and responding to the driving behaviours of women. A recent study published in Scientific Reports suggests that such driverless cars should have different settings for gender and age, because women are better at taking back control of the vehicle when required to respond to a hazard. Women are also more hesitant about autonomous or driverless cars.
Virtual reality systems are designed to create immersive experiences, but many of them are designed with male users in mind. This can make them less effective, or simply uncomfortable, for women, who may have different body shapes and sizes. VR headsets may not fit properly on women's faces, or may be too heavy for some women to wear comfortably. "It's extremely intimate technology, but it's made for someone else," says Adi Robertson of The Verge. She describes how much she has to tighten the headset to make it fit, and how far the lenses sit from her eyes when they should be close; the motion controller rings leave a large gap around her hands. The VR headset is sexist; the VR headset is a failure. Women are half the population, yet they were simply ignored and not considered when these products were designed.
There are numerous reasons for this gap in data analysis and for women's absence from it, ranging from plain sexism to women's underrepresentation in education and the workforce. Research and surveys are an essential part of product and policy design, and interviewing and surveying only men introduces a sampling bias, often without anyone consciously choosing to do so. It is difficult to propose a single solution to this issue: do we want a clause requiring all current and future companies to check on women's needs and requirements, do we want more women in these fields, or do we simply educate men not to be sexist? "When women are involved in decision-making, in research, in knowledge production, women do not get forgotten. Female lives and perspectives are brought out of the shadows" [1]. Well, that's another debate. What is certain is that this absence of female data in research, surveys and product design is harmful, and can even be fatal.