When technology is a tool versus our friend

Tool or friend?

I personally enjoyed Jonathan Zittrain’s discussion of how tech companies can shift algorithms from being a “tool” to being a “friend.” From my understanding, algorithms act as a tool when they give us results regardless of the potential outcome, and act as a friend when they work for us, the user. For instance, Zittrain showed that if you typed the word “Jew” into Google, some of the first search results were antisemitic websites. This is an example of an algorithm acting as a tool rather than a friend for the user. Years later, however, these antisemitic websites are no longer the first results, showing that Google has changed its algorithm. Google may have accepted social responsibility to remove harmful search results, shifting the algorithm from “tool” to “friend.”

However, I feel that Jonathan Zittrain’s prediction that tech companies could make algorithms that are not friendly to users is coming true. In August, The Intercept first reported that Google was in the process of building a censored search engine for internet users in China. This censored search engine could link search results to a user’s phone number, blacklist terms like “student protest,” and replace air pollution results with doctored data sources from China. This is a clear scenario where Google is making a tool that is a friend to shareholders and certain government bodies, but not a friend to the actual user. Many have criticized this move as Google losing its moral compass.

There are many other examples like this where companies create algorithms that are clearly not meant for the user, but for the company. In my tech marketing role, I’ve truly learned how algorithms can work for and against users. There are tools like FullStory that allow you to watch recorded sessions of individual users exploring your website. While this is a friendly tool for marketers, it doesn’t offer much privacy for the users involved. As someone who works in the tech industry, I often ponder my own role in creating and using tools that are not friendly to users. I avoid marketing tactics that rely too heavily on user data, and try to create content based on ethical principles and data.

The human-machine relationship

We can also see this “tool” versus “friend” discussion in our readings this week. Dr. Chayko focuses on what she calls the human-machine relationship in chapters 8–10 of Superconnected. She explores this concept by discussing how children are using and becoming dependent on technology at ever-younger ages: “Children often receive their first phones from caregivers seeking to keep them safe in the event of emergencies . . . many caregivers also do not want their children to be on the wrong side of a perceived digital divide. Owning a cell phone can be an indicator of status, wealth, or power.”

I remember getting my first cellphone in elementary school, but it was only supposed to be used for emergency situations. Receiving a cellphone was significant to me because hardly any other kids had one, and it felt like I had been given a special privilege. And back then, this was just a simple flip phone – there wasn’t much to do on it except call my parents. However, by the time I was in high school, smartphones had become a thing and almost everyone had one. I wanted one too, not because I needed one for an emergency, but because of everything it could do.

In just a ten-year span, our use of cell phones has flipped from something to use in an emergency to something we use for almost anything, out of convenience. In a way, our cell phones have transformed from “tool” to “friend” – we can easily request a ride, find a place to eat, and text our friends along the way. But this much convenience has also led to an over-dependence on our phones. I wouldn’t say convenience is the reason we are “addicted” to our cell phones, though. We are not addicted to convenience; we are addicted because of how the algorithms have been designed.

Social media news feeds are addicting because they track what we are interested in and continuously show us topics related to our interests. While keeping our news feeds relevant and interesting is a nice “friend-like” feature, it is not designed for us – it is designed to keep us using the application. Today’s UX designers and engineers carry a huge social responsibility to design mobile interfaces that are not addictive. An article on the Adobe Blog suggests that UX designers are “responsible for keeping users’ rights protected and their experiences enjoyable, but ethical as well.” When engineers and UX designers consider shifting algorithms for users, they must first ask themselves whether there are any ethical consequences to making these changes.
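To make the feedback loop above concrete, a feed-ranking algorithm might score each post by how often the user has previously engaged with its topic. This is a minimal, hypothetical sketch – the topic names and function are my own illustration, not any platform’s actual ranking code, and real feeds weigh far richer signals (watch time, recency, social graph):

```python
from collections import Counter

def rank_feed(posts, interactions):
    """Rank posts by how often the user engaged with each topic.

    posts: list of (post_id, topic) tuples.
    interactions: list of topics the user previously clicked on.
    """
    interest = Counter(interactions)  # engagement count per topic
    # Posts on topics the user engages with most float to the top.
    return sorted(posts, key=lambda p: interest[p[1]], reverse=True)

feed = rank_feed(
    [("a", "sports"), ("b", "politics"), ("c", "cooking")],
    ["politics", "politics", "cooking"],
)
# The politics post ranks first because it matches the user's
# heaviest past engagement.
```

Ranked this way, the feed keeps surfacing whatever the user already engages with – which is exactly what makes it sticky, and exactly the ethical question designers have to weigh.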

One of the best things we can do is educate the next generation about these harmful algorithm practices. Not long ago, I read an article claiming that Gen Z is quitting social media in droves. I’m not sure how true this is, but it gives me hope that the next generation is thinking about the ways algorithms and technology affect them.

Posted on September 23, 2018, in Social Media, Society, and Trust. 8 Comments.

  1. I found this passage in the first part of your post most intriguing:

    While this [“Full Story”] is a friendly tool for marketers, it doesn’t offer much privacy for users who are involved. As someone who works in the tech industry, I often ponder my own role of creating and using tools that are not friendly to users.

    The thing I always like about Zittrain’s talks is that he tries to bring things back to the ethics involved, most likely because he’s a cyberlaw scholar. In fact, it’s because of him that I became aware of this recent NYTimes piece on data exploitation.

    Many students in this course often refer to their workplace in blog posts and subsequent final papers, but I wonder if you want to push at what you have started reflecting on here in terms of your role in creating tools for users?

• Definitely, I can discuss my role a bit (more so as a marketer than a creator of tools, though). I believe it is my role to design ethical and educational websites, product releases, and articles for users. As a content creator, I need to earn genuine trust from users by creating educational content. There are some practices of content marketing that can get into gray areas, though.

      For instance, it has become common practice to put certain types of content behind walls so users have to provide their personal information before they get access to it. I disagree with this practice and avoid it as much as possible. However, there is pressure to use these practices because many believe it is better at producing leads.

Many content marketers, like Joe Pulizzi, have shown that the best-performing pieces of content are the ones that are genuinely educational and made freely available to users (i.e., companies can drive more leads just by being educational). Whenever I work at a new company, I try to teach this approach to content to those I work with. It can be difficult, though, because many still believe that walled content performs better.

  2. Can you explain the concept of the Full Story tools a bit more? I’m not familiar with them and would be intrigued to hear an insider user’s view.

A major takeaway from your post is that our jobs are becoming multi-layered, with ever more ethical and technological implications placed on content creators. I wonder if this will lead more corporations and agencies to establish an ethics/regulations division, but I cynically doubt it if they are not “required” to do so.

    We’re trying to tackle this at our college by creating more critical reading assignments and even designing courses on information literacy. As tech savvy as many of us feel, because technology changes so rapidly, we may not be aware of the latest changes that applications make (I certainly don’t always read the user agreement each time an app updates my software).

• FullStory (https://www.fullstory.com/), once plugged in to your website, allows you to watch recorded sessions of a user’s interactions with your website. As they put it, you can see the site through their eyes – all their clicks, all the pages they visit, etc.

Any user who visits your site will have their interactions recorded, and those sessions can be played back at any time. This is an amazing usability tool because marketers can directly see the problems users encounter on the website. It is almost better than a usability test because the user doesn’t face the pressure of feeling watched; it lets you see how a user naturally interacts with your website.

While the user’s identity is protected (for the most part), there are controls that allow you to filter by the user’s location and more. While incredibly useful from a marketing standpoint, it does make you wonder whether companies should have this level of visibility, especially since the user has no idea that their interactions are being recorded.

  3. Hi Jeffrey,

    I really enjoyed reading your post.

    Thank you for including the Google clip, which I watched three times (by the third round, I was mostly considering the potential backlash). Though I don’t necessarily disagree with Google’s potential approach, I can’t help but predict a major fallout in the form of Googler anarchy.

    Like you, my very first experience with a cellphone was largely a safety precaution, as I would borrow my parents’ phone while out with friends, in the event of an emergency. Looking back on it, how on earth did we ever fit those bulky, awkward devices into our pants pockets?

    You do a nice job of dissecting the ‘tool’ versus ‘friend’ analysis. Your analytical breakdown is especially helpful for someone like me, who combined the two sides early on while struggling to consider them as separate components of technology. Perhaps I am too ‘Best of both worlds’ minded.

    Also, I like your take on social media news feeds, which certainly have become subjectively addicting in accordance with each respective user’s needs and interests. How unsettling is it that technology can very quickly and sneakily get to know us better than we know ourselves?

    Great work!

    ~Jeff

    • Hi Jeff,

      I can definitely see why Google wants to get back in China with its censored search engine. However, the implications of a censored search engine and allowing a Government/foreign business to have control of what gets filtered is an unethical business decision. While there is backlash against Google, I feel that it will unfortunately be forgotten soon.


It is definitely unsettling how addicting our news feeds can be. I hope next week’s readings inspire me to write a happier, more positive post about technology, haha.

  4. Hi Jeffrey,

I personally liked your interpretation of the video by Jonathan Zittrain and its relation to algorithms. As users become more acclimated to Google, searching, and the internet in general, do you think this correlates with how algorithms have adjusted over time? Are users becoming more skilled, and more or less reliant on technology? And what about the features search engines often use, such as “predictability,” to try to help you through searching?

An article I found highlights another perspective on the use of predictability and the impact it is having not only on users, but also on search results. You can view the full article here: https://qz.com/527008/an-algorithm-can-predict-human-behavior-better-than-humans/

The article even states: “But a new MIT study suggests an algorithm can predict someone’s behavior faster and more reliably than humans can.” What are your thoughts on this, and how do you see it impacting or altering new technology down the road?

In the latter part of your post, you describe the first time you had the opportunity to use a cell phone, the transition since then, and how the intricacies and affordances have changed. I enjoyed how you highlighted the concept of your cell phone being labeled a “tool” and how that “tool” transformed into a “friend.” While it can be comical to refer to your phone as a “friend,” I see a lot of physical interaction where users now rely on these devices as one. Do you think this is partly due to being able to show or display what’s happening in your social circle 24/7?

    Great post and insight. The topic and changes being made daily in the field of technology are so intriguing and at times can be very challenging to keep up with.

    – Kim

• On Google’s predictability and how it will affect the future: I often find it interesting how Google filters search results once it starts to understand a user’s behavior. I believe it is one of the reasons why our nation feels so divided – our search results and news feeds are leading us to entirely different sources. Predictability is causing us to become more divided.

      Regarding mobile phones, I feel social media websites make it incredibly easy to post what you are doing 24/7. It is a nice “friend-like” feature because of its ease of use. It is certainly a lot easier to post something on your social media feed than to start a conversation in real life or sit there and do nothing. We appear to be friends with our phones because it is more convenient, and easier to communicate and do things online.
