Book Review #2: “Invisible Women: Exposing Data Bias in a World Designed for Men”

Anurag Bhatia
3 min read · Mar 8, 2020


How come, in many cities across the world, it is relatively easier to commute by public transport between the outskirts (residential localities) and the downtown area (offices) than, say, from one suburb to another? Short answer: employed men usually need the former kind of trip, while women who are homemakers need the latter (trips to the grocery store, child day-care centres, and so on).

Is it good enough to know that a new drug on the market was extensively tested on mice during trials? What if an overwhelming number of those mice happened to be male? Or, even worse, what if the pharmaceutical company just didn’t bother to disclose this information?

This book is replete with examples where an entire gender, comprising 50% of humanity, is not even considered when designing products and services meant for consumers (ironically, both male and female). Sounds implausible, right? Let’s consider something very dear to all of us: safety in the vehicles we use daily to commute. When it comes to crash-testing cars, surely the regulators must be testing with both male- and female-shaped dummies? Tough luck. What about the ergonomics of car seats: head-rests, seat-belts and so on? It turns out we are being naive if we assume this is done keeping both genders in mind. The result: the numbers suggest female passengers are more likely than men to be injured or killed in a car crash.

The book relentlessly drives home the message that we have been asking the wrong questions all along. To take one example, the problem is not just ‘tech-blind women’ (a lack of women who are tech-savvy); it is just as much about ‘women-blind tech’. Apparently, Apple’s Siri was much better at finding the nearest brothels and Viagra suppliers than at locating abortion clinics. Sheer coincidence? Take this: it was also found far better equipped to suggest an immediate response to a cardiac emergency than to respond to a user who says “I have been raped”. And there is a real danger of machine learning algorithms ending up perpetuating these hidden biases even further. Yet another reminder to those of us working in data science to be extremely careful about this.
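To make that danger concrete, here is a tiny numerical sketch of my own (not from the book, and with entirely made-up numbers): a single “gender neutral” model is fitted on a sample that is 95% male, and the prediction error for the under-represented group ends up systematically larger.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy example: predict some ergonomic quantity (say, an ideal head-rest
# height in cm) from a person's overall height. All relationships and
# numbers below are invented purely for illustration.
n = 1000
is_male = rng.random(n) < 0.95                # training sample is 95% male
height = np.where(is_male,
                  rng.normal(175, 7, n),      # male heights (cm)
                  rng.normal(162, 6, n))      # female heights (cm)

# Hypothetical ground truth: the relationship differs slightly by group.
target = np.where(is_male,
                  0.52 * height + 4.0,
                  0.49 * height + 12.0)
target += rng.normal(0, 1.0, n)               # measurement noise

# Fit one "gender neutral" least-squares line on the pooled data.
X = np.column_stack([height, np.ones(n)])
coef, *_ = np.linalg.lstsq(X, target, rcond=None)

# Evaluate the pooled model separately on each group.
err = np.abs(X @ coef - target)
print(f"mean abs error, male samples:   {err[is_male].mean():.2f} cm")
print(f"mean abs error, female samples: {err[~is_male].mean():.2f} cm")
# The fitted line tracks the majority group; the minority group quietly
# inherits a much larger error, even though the model is "neutral".
```

Nothing in this pipeline throws an error; the skew stays invisible unless you deliberately evaluate the two groups separately, which is precisely the kind of question the book urges us to ask.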

The list just goes on and on. It’s not a conspiracy; had it been one, it would have been relatively easy to unravel once exposed. Instead, it seems to be the result of decades of neglecting gender parity in data collection, cutting across industries: lazy (and, more so, chauvinistic) men presumptuous enough to suggest that data collected only about their fellow men will suffice. Worst of all, that seems to have become the norm.

And though the intensity may well vary, this is clearly not just a third-world problem. Even at the best of times, women are treated as if they were nothing but scaled-down versions of men. Never mind the fact that their mass distribution and bone density, to name just a couple of factors, usually differ.

The author argues that “gender neutral” policies are just not good enough, since the underlying data on which they are based is terribly skewed in favour of men. Indeed, women are completely absent from most of those datasets and hence become “invisible”. She cites study after study to substantiate this claim. And, as she mentions more than once in the book, for heaven’s sake, let’s just stop using the term “working women”, since most women do work anyway; whether they get paid for it or not is another matter.

I strongly recommend this book to everyone. It is an eye-opener; it raises provocative questions, and its findings are disturbing. But I guess that is the very first step in solving any problem: acknowledging that we have a huge one staring us in the face, with far-reaching consequences.

Happy reading…! :)
