I recently read Meredith Broussard's "More Than a Glitch" to occupy myself during
boring lectures, and I have to say, it was a real hoot. If you're looking for a book
that will make you laugh out loud while also making you think about the ways
technology can perpetuate social inequalities, this is the book for you.
Broussard, a professor of data journalism at New York
University, uses examples from the worlds of journalism, advertising, and
criminal justice to illustrate her points. For example, she discusses how
algorithms used in the criminal justice system can be biased against certain
groups and how the lack of diversity in the tech industry can lead to products
and services that only serve the needs of a narrow segment of the population.
Broussard's main argument is that technology is not a
neutral force, despite what we may be led to believe. Instead, she argues,
technology is shaped by the biases and values of its creators, and it can
perpetuate and amplify social inequalities if we're not careful. And let me
tell you, as someone who spends way too much time on social media, I can
confirm that Broussard's argument is spot on.
Returning to the criminal justice example: Broussard shows how the algorithms
used there can end up biased against certain groups. As a college student, I
don't have any personal anecdotes about the
criminal justice system, but I do have a story about a particularly terrible
online quiz I took once.
It was one of those "Which Marvel Superhero Are
You?" quizzes that are all over the internet. I was feeling particularly
bored and decided to take the quiz just for fun. The quiz had a series of
questions about my personality and preferences, and at the end, it was supposed
to tell me which superhero I was most
like.
But when I got my result, I was shocked. According to the quiz, I was most like... drumroll, please... Kingo, Kumail Nanjiani's character from Eternals. Now, don't get me wrong, Kingo is a great character, but I couldn't shake the feeling that the quiz was biased against me because of my race. I'm Indian, and I wondered whether it had matched me to a character based on stereotypes rather than on my answers.
This may seem like a silly example, but it illustrates
Broussard's point that technology is not neutral. Even something as seemingly
innocuous as an online quiz can be biased and perpetuate stereotypes.
In my own experience in college, I've noticed that many of
my male classmates are much more confident and assertive than my female
classmates. That gap is often attributed to societal expectations and gender
roles, but it also mirrors what the lack of diversity does inside the
tech industry. When you're constantly surrounded by people who look and act
like you, it's easy to feel like you belong and to speak up and share your
ideas. But when you're in the minority, it can be much harder to find your
voice and assert yourself.
One of the things I appreciated about this book is that Broussard
doesn't just point out the problems with technology; she also offers solutions.
For example, she argues that we need to be more intentional about creating
technology that serves the needs of all people, not just a select few. This
means having a more diverse workforce in the tech industry, as well as being
more deliberate about how we design and use technology.
As a college student, I think this is an important message.
We're the next generation of tech workers and consumers, and we have the power
to shape the future of technology. But to do that, we need to be aware of the
biases and inequalities that exist in the industry, and we need to be proactive
about addressing them.
Of course, it's not just up to individuals to solve these
problems. Broussard also argues that we need systemic change, including better
regulation of technology companies and more accountability for the harm that
technology can cause. This is something that I think is especially relevant in
light of recent controversies around social media and online privacy.
One personal anecdote I have about this is the time I
accidentally shared a really embarrassing post on Instagram. I won't go into
the details, but let's just say it involved a lot of typos and some ill-advised
political opinions. I quickly deleted the post, but not before a few of my
friends had seen it.
The experience made me realize how little control we have
over our online presence, even when we think we're being careful. Social media
platforms like Instagram and Twitter collect massive amounts of data about us, and
they can use that data to target us with ads, influence our opinions, and even
sell our information to third parties. It's a little scary when you think about
it.
Broussard gives several examples of this in the book. For
instance, she talks about how facial recognition software has been shown to be
less accurate at identifying people with darker skin tones. This is because the
underlying models were trained on data sets that were overwhelmingly composed of
lighter-skinned faces.
This might not seem like a big deal, but when you consider
how facial recognition technology is being used by law enforcement agencies to
identify and track individuals, it becomes clear that these biases can have
serious consequences. If the technology is less accurate at identifying people
of color, it could lead to innocent people being wrongly accused of crimes, or
it could result in people being unfairly targeted by law enforcement.
More broadly, I think it's important to be aware of
these issues and to think critically about the technology we use every day. We
can't just assume that because something is new or innovative, it's
automatically good. We need to ask questions like, who is this technology
benefiting? Who is it harming? What biases might be built into the system?
These questions hit home for me on a trip when I relied on a translation app
while haggling with a vendor. When I tried to negotiate for a reasonable
price, the app translated my request as "take no more than a thousand
rupees, you are looting us non-locals." This was not what I intended to
say, and it made me look rude and insensitive. I realized that the app was
using a literal translation of my words, without taking into account the
nuances of the language or the culture I was in.
This experience made me realize how important it is to be
mindful of the technology we use, especially when it comes to communication and
cross-cultural interactions. We can't just rely on apps and algorithms to do
all the work for us; we need to be aware of the limitations and biases of these
tools, and be willing to put in the effort to communicate respectfully and
effectively with people from different backgrounds.
In conclusion, "More Than a Glitch" is a book that
challenges us to think critically about the technology we use and the impact it
has on our lives. It's a funny and engaging read, but it's also a call to
action. As students, we have a responsibility to be informed and engaged
citizens of the digital world, and this book is a great place to start. So,
pick up a copy, and let's get to work.