When technology is a tool versus our friend
Posted by jeffreyuw
Tool or friend?
I personally enjoyed Jonathan Zittrain’s discussion of how tech companies can shift algorithms from being a “tool” to being a “friend.” As I understand it, an algorithm acts as a tool when it simply returns results regardless of the potential outcome, and acts as a friend when it works on behalf of us, the user. For instance, Zittrain showed that if you typed the word “Jew” into Google, some of the first search results were anti-Semitic websites. This is an example of an algorithm acting as a tool rather than a friend to the user. Years later, however, these anti-Semitic websites are no longer the first results, showing that Google has changed its algorithm. This looks like a case of Google trying to shift the algorithm from “tool” to “friend” — accepting a measure of social responsibility to remove harmful search results.
However, I feel that Jonathan Zittrain’s predictions that tech companies could build algorithms that are not friendly to users are coming true. In August, the Intercept first reported that Google was in the process of building a censored search engine for internet users in China. This censored search engine could link search results to a user’s phone number, blacklist terms like “student protest,” and could replace air pollution results with doctored data sourced from China. This is a clear scenario where Google is building a tool that is a friend to shareholders and certain government bodies, but not a friend to the actual user. Many have criticized the move as Google losing its moral compass.
There are many other examples where companies create algorithms that clearly serve the company rather than the user. In my tech marketing role, I’ve seen firsthand how algorithms can work both for and against users. There are tools like “Full Story” that let you watch recorded sessions of individual users exploring your website. While this is a friendly tool for marketers, it offers little privacy to the users being recorded. As someone who works in the tech industry, I often ponder my own role in creating and using tools that are not friendly to users. I avoid marketing tactics that over-rely on user data, and I try to create content based on ethical principles and sound data.
The human-machine relationship
We can also see this “tool” versus “friend” distinction in our readings this week. Dr. Chayko focuses on what she calls the human-machine relationship in chapters 8–10 of Superconnected. She explores this concept by discussing how children are using, and becoming dependent on, technology at ever-younger ages: “Children often receive their first phones from caregivers seeking to keep them safe in the event of emergencies . . . many caregivers also do not want their children to be on the wrong side of a perceived digital divide. Owning a cell phone can be an indicator of status, wealth, or power.”
I remember getting my first cellphone in elementary school, but it was only supposed to be used in emergencies. Receiving a cellphone was significant to me because hardly any other kids had one, and it felt like I had been given a special privilege. Back then, it was just a simple flip phone — there wasn’t much to do on it except call my parents. By the time I was in high school, however, smartphones had arrived and almost everyone had one. I wanted one too, not because I needed one for emergencies, but because of everything it could do.
In just a ten-year span, our use of cell phones has flipped from something reserved for emergencies to something we use for almost anything — convenience. In that sense, our cell phones have transformed from “tool” to “friend”: we can easily request a ride, find a place to eat, and text a friend along the way. But this much convenience has also led to an over-dependence on our phones. I wouldn’t say convenience is why we are “addicted” to our cell phones, though. We are not addicted to convenience; we are addicted because of how the algorithms have been designed.
Social media news feeds are addictive because they track what we are interested in and continuously show us related topics. While keeping our news feeds relevant and interesting is a nice “friend-like” feature, this design is not for us — it is designed to keep us using the application. Today’s UX designers and engineers carry a huge social responsibility to design mobile interfaces that are not addictive. An article on the Adobe Blog suggests that UX designers are “responsible for keeping users rights protected and their experiences enjoyable, but ethical as well.” Before engineers and UX designers change an algorithm that users depend on, they should first ask themselves whether the change carries any ethical consequences.
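To make the feedback loop concrete, here is a minimal, purely hypothetical sketch (not any real platform’s algorithm) of engagement-driven feed ranking: each post is scored by how often the user has already clicked on its topic, so the feed keeps surfacing more of whatever is holding their attention.

```python
from collections import Counter

def rank_feed(posts, click_history):
    """Toy engagement-driven ranking: score each post by how many
    times the user has previously clicked on its topic, so the feed
    keeps showing more of what already captures their attention."""
    interest = Counter(click_history)  # topic -> past click count
    return sorted(posts, key=lambda p: interest[p["topic"]], reverse=True)

# A user who has mostly clicked on "politics" now sees politics first,
# whether or not that serves their broader interests.
posts = [
    {"id": 1, "topic": "gardening"},
    {"id": 2, "topic": "politics"},
    {"id": 3, "topic": "politics"},
]
ranked = rank_feed(posts, click_history=["politics", "politics", "gardening"])
```

Notice that nothing in this loop measures whether the content is good *for* the user — only whether it keeps them clicking, which is exactly the “tool for the company” dynamic described above.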
One of the best things we can do is educate the next generation about these harmful algorithmic practices. Not long ago, I read an article claiming that Gen Z is quitting social media in droves. I’m not sure how true this is, but it gives me hope that the next generation is thinking about the ways algorithms and technology affect them.