The Rise of the Algorithm
Posted by baileybudnik
As I was putting my phone down to start working on this post, I scrolled through one last Instagram story. In it, a local account detailed their experience making a homemade pumpkin pie, breaking the process down across multiple stories with short captions covering everything one would need to know to follow the recipe. As casual as this may seem, Christine Wolf’s piece touches on this scenario directly. Wolf states that Web 2.0 platforms that “[allow] users to generate and distribute their own content can support informal and self-directed learning” (Wolf, 2016). After reading this statement from “DIY Videos on YouTube” by Christine Wolf, I immediately connected it to what I had engaged with moments before. I have never baked a homemade pumpkin pie from scratch, but the organic content generated within the Web 2.0 structure allowed me to teach myself a new skill.
Now, that story only appeared first on my feed because the Instagram algorithm tracked that I routinely watch this specific account’s stories, which pushes their content to the top of my homepage when I log into the application. I was not presented with that DIY baking video by accident; rather, the algorithm is in control.
As a person who has dabbled in creating my own creative content, I struggle to appease the social media algorithm. It seems to have a mind of its own, sorting content and audiences into invisible groups. Algorithms can guide us to engage with our passions and niches on internet platforms, and at times they even present us with new content that we end up admiring. However, algorithms also box in and categorize users. On one hand, this seems like a helpful tool for presenting users with content they enjoy while keeping them logged into the platform.
On the other hand, algorithms also stop the flow of new knowledge into the user’s algorithm-created bubble. For example, if I only watch YouTube videos about gardening, there is a good chance I will never be presented with exercise content. Maybe the user would not want to see different content, but maybe they would also want to be pulled outside their echo chamber. It is true that “algorithms also shape individuals’ experiences with Web 2.0 platforms in many ways” (Wolf, 2016), and this point of Wolf’s reminds me of another Web 2.0 experience I had just a few days ago. I recently downloaded TikTok, mostly to understand why the application surged in popularity this year. Before downloading it, my only concept of it was a platform where people performed choreographed dances to music in short sixty-second videos. That niche did not interest me at all, but when I created my TikTok account, totally different content filled the screen. I was surprised to notice that the same niches I followed on other platforms appeared in my TikTok feed as well. It was almost as if the overarching Web 2.0 algorithm already knew what I would like. Instead of watching dances, I interacted with the application in a totally different way than I had expected. My experience and concept of the application changed because of the algorithm’s prior notions about me.
As interesting as I find this, it also has negative consequences. Gone are the days when content flowed completely and freely into your space, if there ever was such a day!