Category Archives: Digital
In Superconnected, Mary Chayko discusses the inception of Google. Developed by Stanford PhD students Larry Page and Sergey Brin, the search engine revolutionized the internet when it became publicly available in the late 1990s and refined its algorithms in the early 2000s. Today, Google is the world’s leading search engine.
“At the same time that it produces results for the user, Google also stores, caches, and archives large portions of web content as the web is being searched…Apple, Microsoft, Facebook, Yahoo, and other major tech companies also allow the data that flows in and through their platforms to be mined and in some cases participate in the mining. As a result, nearly everything that is done on the internet is tracked, analyzed, stored, and then used for a variety of purposes,” Chayko writes.
Google Accumulates Power
In May of this year, Steve Kroft of the TV news magazine 60 Minutes reported on the power of Google and on critics who say the company, worth three-quarters of a trillion dollars, is stifling competition. Google, which is owned by the holding company Alphabet, went public in 2004. It has also bought more than 200 companies, including YouTube, the largest video platform, and Android, the operating system that runs on roughly 80% of smartphones.
In the 60 Minutes story, Gary Reback, a well-known antitrust lawyer, says Google is a monopoly. He says it’s a monopoly not only in search, but also in other industries such as online advertising. Plus, Google accumulates information about users and sells that information to advertisers. He points out that people tell search engines more than they tell their spouses, giving Google a “mind-boggling degree of control over our entire society.”
Business Insider reports Google is also a major player in the news industry, surpassing Facebook last year as “the leading source of traffic to news publishers’ websites” according to Chartbeat, including “the majority of traffic to publishers’ websites from mobile devices.”
Google Dominates its Competition
Also, in May, the Wall Street Journal’s Christopher Mims wrote about the growing demand to break up the monopolies of Google, Facebook, and Amazon. He writes, “…as they consolidate control of their markets, negative consequences for innovation and competition are becoming evident.”
Jonathan Taplin, a digital media expert, says in the 60 Minutes story that Google has no real competition: it has 90% of the search market, while Bing, Microsoft’s search engine, has 2%. Yelp co-founder Jeremy Stoppelman points out that Google has changed its search results over the years so that instead of returning the best information from around the internet, the results at the top of the first page are often from Google properties. Google lists results from its own properties first, such as maps, restaurant reviews, shopping, and travel information. This is especially important when many users view results on the small screen of a mobile phone.
Google Faces Regulation
Google has been fined by the European Union for anticompetitive actions. Over the summer, the EU slapped Google with a $5 billion fine; according to Business Insider, the EU ordered Google to stop using its Android operating system to block competitors. Google is appealing that fine. Last year, the EU fined Google $2.7 billion for illegally promoting its shopping search results over its competitors’.
The U.S. government should follow the example of the EU and provide more oversight of Google and other tech giants. It’s clear that Google is a powerful force in society, and with the company’s dominance comes the need for transparency and accountability. Recently, Google, Facebook, and Twitter have been called to testify and answer questions at U.S. Congressional hearings regarding Russian interference in the 2016 U.S. elections. An Axios article by David McCabe had more ideas on how the government could provide oversight:
- Require Google to release more information regarding its algorithms
- Make it easier to sue big tech companies like Google
- Designate Google as a “common carrier,” which would allow the government to appoint a body to oversee it
All of these options should be considered, and more should be done to make sure Google and other powerful tech companies do not wield too much influence over our lives without our knowledge and consent. It should be noted that I relied heavily on Google to research this blog post.
Hart-Davidson hits the nail on the head: content management systems (CMSs) “do not do that work by themselves” (p. 14). A CMS gives a company back only what it is willing to put in. It is not a solution; it is a tool. It is exactly what we make of it. Hart-Davidson states that “technical communicators typically come to play many different roles and deploy diverse sets of skills over the course of a career” when using a CMS (p. 134). The roles mentioned must be assumed, but to successfully integrate a CMS, a company must also integrate one or more of its processes into the system to really benefit from it.
Training, or some kind of education on how the company uses a CMS, is a key to success. I’ve used quite a few systems and have seen both excellent and poor uses of them in companies. When companies don’t have any rules around how a CMS is used, it becomes a free-for-all of good and bad information. It’s confusing. There is a plethora of content available online for learning how to use and manage these systems. However, even if you know how to use the system, that may not be how the company uses it. The video below only touches on some common mistakes in administering SharePoint itself, and it’s over an hour long.
Michael J. Salvo and Paula Rosinski both discuss “mapping” and “signposting” in information design (pp. 112-114). These concepts are a big part of UX and extremely important for ensuring users can become literate in a system. I’ve found these levels of user interface design are not well applied in most CMSs. At one of the companies I worked for, I had to redesign the front end of a SharePoint site to make it more accessible and simpler for others in the company. This tells me that we still have a long way to go in CMS interface design. Confusion in the interface itself will almost surely create inconsistent data, especially when most people have access to the system.
The process around how you use a CMS is key to making the system useful. Yes, it can version documents, but when people are not required to update or sign off on documentation, it can create data that looks trustworthy but is not. Most systems have workflows integrated into them, but unless going through that workflow is part of a sign-off process for the deployment of a product, why would people go through the hassle?
To make sure our documentation is trustworthy, my team and I link our documents to specific releases of software. This way, it is clearer in what context a document can be assumed to be relevant. In terms of metadata, we make sure that everything is under our team’s section in the system. We also have the option to tag certain customers if a document is specifically relevant to that context. The process we employ ensures that we do not have to continually maintain every document, but can instead deploy documentation at our own pace and as needed.
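A minimal sketch of the kind of record this process produces; the field names here are hypothetical illustrations, not our system’s actual schema:

```python
# Hypothetical per-document metadata record; real CMS fields will differ.
from dataclasses import dataclass, field

@dataclass
class DocRecord:
    title: str
    team_section: str  # everything lives under our team's section
    release: str       # the software release the document is linked to
    customers: list = field(default_factory=list)  # optional customer tags

doc = DocRecord(
    title="Installation Guide",
    team_section="platform-docs",
    release="2.4.1",
    customers=["acme"],  # tagged because the doc is specific to this customer
)

# A reader can now judge relevance from the release, not the edit date.
assert doc.release == "2.4.1"
```

The point of the sketch is that relevance is carried by explicit metadata rather than by keeping every document perpetually up to date.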
I don’t think I could live without a CMS at a company these days, because the alternatives are much worse. But literacy in these systems remains a problem, probably because the users are not the same as the customers. Additionally, I see many systems treated as a golden solution instead of a platform. It will be interesting to see how these systems and their usage evolve over time.
I found this week’s reading fairly awkward because it included software engineers as technical communicators. “Software engineer” is a much-misused term to begin with. Rachel Spilka’s book gave me the feeling that they used to be more document-centric, but now they are jack-of-all-trades developers and managers, sometimes DevOps engineers, and sometimes just programmers. Many software industry titles trend toward a jack-of-all-trades job, hence the new title “full-stack engineer.” Full-stack engineers are usually developers who know all aspects of building a web application. Why pay multiple people when you can get one who knows how to do everything? Initially, “technical communicator” sounded like a far-fetched addition to the software engineer’s toolbox.
When I was studying for my computer science degree, most professors seemed to accept that most of us were simply not going to be gifted in the writing department. Writing was not a required or emphasized aspect, even though my degree had a software engineering emphasis. From my experience in industry, I cannot disagree. Most legacy code I have worked on is not documented from the technical side at all. It’s not always a matter of talent or ability; honestly, the last thing most of my colleagues want to do after coding is sit down and write documentation for days afterward. Additionally, one extra line of code can change most or all of a document on system functionality. Our management views documentation as a nice-to-have, not a show-stopper if it’s missing. We are never interviewed on our writing skills. This first-hand knowledge made me raise an eyebrow when Spilka listed software engineers as technical communicators from the late ’90s to now.
What I realized partway through reading was that the documentation Rachel Spilka refers to has changed just as the job titles have. The documentation a software engineer generates is dynamic and not always a formal breed of documentation. Spilka states a couple of times in the book that technical communicators’ audiences have changed from experts to novices. It seems to me that responsibility for power-user documentation has been assumed primarily by software engineers, architects, and systems engineers, while technical writers create more customer-facing or public documentation.
So, how do software engineers document? We document when we want to ensure that we don’t have to work more than we want to. The documentation we produce is aimed at fellow engineers, so we don’t have to repeat ourselves too much when new people are hired or start working on what we have already built. We also document production systems with installation and troubleshooting guides for when things go very wrong. In our engineering sector, we call both of these types of documents “playbooks.” These playbooks seem very similar to the initial documentation created by technical communicators in the ’70s (Spilka, 2010, p. 22).
We keep these playbooks on a content management system that is accessible by the entire company, so if they want they can just go to our page and try to find the answer to their question before talking to us. We can also receive comments on the content management system so that all discussions on the documentation are public. Sometimes the documentation just looks like notes and sometimes it looks like a proper installation document depending on its purpose. We also document even less formally by creating static and dynamic charts and graphs for the design of our system. These can be the most useful in explaining functionality to other software engineers. We also document by putting comments in code to explain exactly what we are trying to do algorithmically. All of these forms of documentation fully take advantage of the technological changes that have been granted to us to make technical communication more efficient.
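A hypothetical illustration of that last form of documentation, a comment that records the algorithmic intent rather than restating the code (the function and comments here are invented for this example):

```python
def dedupe_preserving_order(items):
    # Intent: remove duplicates while keeping first-seen order.
    # A set alone would lose ordering, so we track seen values
    # separately and build the result in a single O(n) pass.
    seen = set()
    result = []
    for item in items:
        if item not in seen:
            seen.add(item)
            result.append(item)
    return result

assert dedupe_preserving_order([3, 1, 3, 2, 1]) == [3, 1, 2]
```

The comment explains *why* the code is shaped the way it is, which is exactly the kind of communication a bare line-by-line description would miss.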
This book was written in 2010, so a revision could address the even broader range of technical communication responsibilities in businesses today. For example, systems engineers play a huge role in technical communication among all components of a technical product. That specific role could be very helpful in identifying where technical communication responsibilities have been dispersed in today’s world. Spilka does mention that some of the content would probably be irrelevant for the types of companies I work at. Additionally, every company differs vastly in how it incorporates technical platforms and integrates them with engineering processes. I can only imagine the challenges Spilka encountered in trying to compile the history of technical communication.
This is my first course for my certificate requirements. I wasn’t totally sure I would “fit” into the MSTPC program, since my background is literature and I have limited experience with technical writing and media. I saw it as a challenge to my boundaries of knowledge. However, as a reader of some of the class material, I felt I was not part of the target audience, since I am not familiar with technical-writing jargon. Of course, if readers cannot relate to the material, it is a struggle for them to maintain interest and focus. Nonetheless, I kept on reading. As I read Blythe, Lauer and Curran’s “Professional and Technical Communication in a Web 2.0 World,” I began to relate, to focus, and to reflect.
I teach mainly composition at a technical college, yet we still devise our composition classes as if they were for a four-year college. I have had some of my students complain about having to take one writing class since they felt it didn’t pertain to their program. Of course, in the end they understand that any writing genre (mainly essays) will help them communicate more effectively in their careers. However, the set curriculum may not be sufficient if many of my technological-minded students are going into careers where more technical writing would be the norm.
A student who graduates from a technical school is more apt to be required to write the forms of communication mentioned in Blythe, Lauer and Curran’s report. Figure 1 (Blythe, Lauer and Curran, 2014, p. 273) lists research papers at the bottom of the “most valued” column, whereas emails, instruction manuals, websites, presentations, and blogs are at the top of both the “most often used” and “most valued” lists. So perhaps I can begin making changes in my courses to meet the future needs of my students.
I am not discounting the value of essay writing or the objectives of our mandatory writing courses, for essay writing requires many of the skills needed for more technical forms of writing. However, exposing students to other genres of writing could be beneficial: it may attract the interest of a more tech-savvy (or tech-interested) audience, and it may lead students to feel they are getting more out of a course that they can apply directly to their programs and future careers.
Perhaps being a student again (not originally by choice) has reminded me of how my students feel when entering my required classes. Plus, this class is broadening my understanding of writing and the value of different forms of communicating in today’s technical world. Hopefully, my students will feel the same.
Has the democratization of the Internet turned us all into Kafka-esque cockroaches? Andrew Keen argues yes in his debate with David Weinberger. From Keen’s perspective, the Internet has stripped away traditional filters and given a voice to the masses — and the resulting clamor shows the worst of humanity. Instead of having gatekeepers in the form of publishers and traditional media sources to groom experts and present us with the best, the unaware Internet user is bombarded by amateurs and their trash.
Image from Books by Audra. http://www.booksbyaudra.com/2016/04/18/considering-kafka/
Weinberger takes the opposing viewpoint: the traditional media filters were flawed, and the Internet offers opportunity for everyday experts and untapped talent. He’s not alone in his assessment. Philip Tetlock created the Good Judgment Project on the premise that nonprofessionals can make more accurate predictions than established experts. Tournament-style, the project identifies the top two percent of “superforecasters,” people without any particular credentials but with a knack for making predictions. Through Web 2.0, these individuals can now connect and share ideas in a way that was inconceivable just twenty years ago.
Interestingly, most of the articles that I saw about everyone being an expert through the leveling of the Internet were from about five to ten years ago. After that, it stopped being news. Now, it seems that the voice given to the masses is assumed and taken for granted. The last decade has softened it from a potential catastrophe to now just an accepted part of culture.
The twist is that the Internet is both still reliant on traditional gatekeepers and developing new types of filters. As we’ve discussed earlier in this course, the more content is created, the more significant it becomes to navigate and find the right content. Jonathan Zittrain discusses how Google and other search engines have become a de facto filter as people attempt to find material online. Zittrain talks about the tension between “neutral” search algorithms and Google’s moral responsibility to present quality, or at least accurate, sources. His talk acknowledges that most people have a knee-jerk reaction against search engines serving as a “Big Brother” and controlling what they see, but they also don’t like the specific examples of overtly wrong or biased sites sitting at the top of search results. Even though anyone can contribute online, search engines and other tools for navigating the web still provide some basic form of filtering. The question is how much power we should give them.
Even in light of the massive amount of user-generated content and the new ways of determining what has value, there is still a role for traditional gatekeepers in keeping audiences from being bombarded. This is good news for Keen, who sees “professional intermediaries [as] arbiters of good taste and judgement.” For me, the example that comes to mind is Wikileaks. On one hand, it embodies the ultimate democratization of all information being released to the public online. On the other hand, nobody reads the thousands and thousands of released leaks, and the general public hears about only the top few items of interest as reported by major media outlets. The gatekeepers still serve to prioritize the information and tell people what they care about.
Wikileaks releases unprecedented amounts of information online, but still relies on traditional filters to make sense of it. The Guardian: https://www.theguardian.com/news/datablog/2010/nov/29/wikileaks-cables-data
The New York Times just ran the article “WikiLeaks Isn’t Whistleblowing,” which offers a scathing condemnation of the Wikileaks approach to “journalism” and argues that massive data dumps are inappropriate and counterproductive because they neither offer context for the information nor discern what is necessary to share. Tufekci writes, “Mass data releases, like the Podesta emails, conflate things that the public has a right to know with things we have no business knowing, with a lot of material in the middle about things we may be curious about and may be of some historical interest, but should not be released in this manner.”
Putting aside the other moral and privacy questions raised by Wikileaks, it serves as an extreme example of how the Internet enables a massive amount of content from all types of sources, while we’re still figuring out the role for filtering and gatekeeping. Keen warns that if we don’t find an answer, we’ll soon see the worst of ourselves reflected back in the Internet and discover our true cockroach nature.
Tufekci, Z. (2016, November 4). WikiLeaks isn’t whistleblowing. The New York Times. Retrieved from http://www.nytimes.com/2016/11/05/opinion/what-were-missing-while-we-obsess-over-john-podestas-email.html
Good Judgment. (n.d.). Retrieved November 5, 2016, from https://www.gjopen.com/
Working together can create more meaning and bring more understanding of the world around us. The ideas in Chapter 4 of Net Smart by Rheingold (2012), especially regarding collective intelligence and the Internet’s capacity to create communities, groups, and audiences that make deeper meaning of what is happening around them, are very powerful and applicable both to our work analyzing and reviewing social media principles and to our work as technical communicators.
I have heard complaints from the generation before mine (professors, staff members, and students who came before) that the way we learn and take in information today does not require the effort and time it used to, and thus that we as a whole are not as smart as we could be, as they had to be in the world before the World Wide Web.
I wholeheartedly disagree. Are things different? Definitely. For the most part, we do not have to deal with card catalogs or worry that the library book we need is already checked out. But what we do have is mountains of information at our fingertips that must be read through, researched, analyzed, and ultimately accepted or discarded as useful to the projects that need to be completed.
Thinking about it as the natural reaction our society has had to the advent of technology and connectedness, collective intelligence seems like a great place for us to be in.
“Now that we have gained access to digital tools that enable us to share what we know and aggregate small contributions into large knowledge repositories, a new level of collective intelligence is possible” (p. 160).
In reality, it is fascinating how much I find myself depending on the opinions and knowledge of others in my personal and professional life.
I read Yelp reviews and will search through a few pages for tips and tricks about shopping: how to do it effectively, where to go for the best prices, and when to go to avoid the most foot traffic.
I use my coworkers as sounding boards when working on projects, running edits, changes, style issues, and new copy by one or more people to see how they react, even when we’re working on completely different projects.
This trend is so important to the way we think about knowledge and learning. It may seem like an obvious idea: we currently learn from teachers and professors, people who go to school and study techniques specifically to learn how to instruct and impart knowledge to others. But to my mind there is still so much stigma associated with the spirit of collective intelligence in schoolwork.
Beginning your career as a student, you do not learn that it is your right, I would say your responsibility, to question the fount of knowledge: the teacher. In order to retain control over groups of wild children, teachers must be seen as the ultimate authority in their spaces. As you grow older and become more comfortable with yourself and with the idea that you must have your own opinions and thoughts about the world around you, you are inundated with cultural norms and taboos. There are subjects you can’t bring up in public without receiving a negative reaction: sex, politics, and religion. There are other subjects that apply only to you and place you into a subgroup: race, gender, sex, socio-economic status, ethnicity.
By high school you have hopefully learned all the rules, overtly taught to you and covertly gathered by osmosis, and gone through puberty, so that you have become a version of yourself that can function in society. You have created PowerPoints and book reports and scientific models. But beyond being forced into groups by your teachers, it is still up to the teacher, as the superior figure, to create meaning and focus your attention on the facts and figures you need to know.
That long analogy is meant to draw attention to the fact that with the Internet and social media, it is up to us to create meaning and monitor the information and knowledge being influenced and cultivated around us. I cannot say with complete certainty that children are reacting differently in classes. There are thousands of studies and reports about classroom teaching and management that are authored about the changes going on in classrooms because of technology and the Internet.
What works for me is the idea that we are demanding more of our teaching professionals and of ourselves than we have before. Yes, the Internet gives everyone a platform to shout their opinions from the rooftops (leading to the degradation of fields like traditional print media). It also gives us the ability to share what we know with each other, outside the limits of roundtables and desks with tiny chairs, even outside the bounds of an online course taught by a PhD.
Rheingold, H. (2012). Net smart: How to thrive online. Cambridge, MA: MIT Press.
In Net Smart, Howard Rheingold recognizes the same trend as Sherry Turkle of the historically unprecedented amount of available information through the Internet. However, Rheingold confronts the challenge of the volume and velocity of digital media with much more optimism. He sees it as a huge opportunity, if people understand the right strategies for managing it.
In his Tedx Talk “Attention: The New Currency,” Sree Sreenivasan argues that getting and keeping attention is critical for success in this world of overwhelming volume. Sreenivasan says, “It isn’t just that our attention spans are getting smaller and shorter but that there’s so much more stuff coming at us and so much more stuff competing for our attention.”
Rheingold makes the case that one way to handle the volume is increased mindfulness about what is getting our attention. He argues that the issue isn’t that multitasking is rewiring our brains, but rather that we do it without even being aware of it. The Washington Post article “Is the Internet Giving Us All ADHD?” suggests that although rates of ADHD are steadily increasing and the Internet facilitates behavior often recognized as ADHD, there is no evidence for a causal link. As the volume of information on the Internet continues to explode, we don’t need to fear possible brain damage, but rather be mindful about where we are putting our attention. Sreenivasan quotes Les Hinton, former publisher of the Wall Street Journal, as saying, “The scarcest resource of the 21st century is human attention.”
However, simply knowing where our attention is going is only the first step in managing information overload. In Chapter 2, Rheingold suggests a dashboard approach to “infotention.” Savvy users organize and manage content in a dashboard style so that they can easily access the most relevant and useful information. When you’ve decided how you want to prioritize your attention, the dashboard approach helps you organize the information that you’ve decided is worth your time.
A third strategy is relying on others as curators. Rheingold tells several cautionary tales about bogus websites and warns about the need for “crap detection.” However, being a “detective” and investigating the source for every website that you visit just makes the volume even more overwhelming. In my experience, leisure users rarely go through the trouble to research a site’s author and dig for source material. Instead, most users have the online news site that they always read, and they trust it — no further investigation necessary. I haven’t been able to find a comprehensive study, but I’m curious about the percentage of time that people spend online on just a handful of favorite sites. I’m guessing that for most people, the majority of their time online is on just a couple of sites that they have deemed as passing the crap detection test.
Beyond curating your own list of favorite sites, people turn to social curation. Just as Google uses the PageRank algorithm (Rheingold, p. 83) to boost search results based on links from other sources, so we turn to the wisdom of the crowd to help us determine which information in the sea of possibilities should get our attention. I saw the article “Social Curation in Audience Communities” about how a Finnish newspaper deemed the participation of its readers in “liking” and sharing articles one of the most critical factors in its success, and how it used strategies to begin leveraging this social curation. The article includes the statistic that up to 75% of the online news consumed by American audiences is forwarded through email or social networking sites. You could argue that this is because of peer pressure, the desire to read what our friends are reading, or other social motivators, but I think it’s also a coping mechanism for handling the volume of information available. When there are too many options, one way to decide is to take the recommendation of others. I think it’s the same as asking your dinner date what to order when you’re at a new restaurant trying to pick from a huge menu.
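The intuition behind link-based ranking can be sketched in a few lines. This toy version is my own simplification, not Google’s production algorithm, and it assumes every page has at least one outgoing link; the point is just that pages earn rank from the pages that link to them:

```python
# Toy PageRank: rank flows along links; well-linked pages accumulate more.
def pagerank(links, damping=0.85, iterations=50):
    """links maps each page to the list of pages it links to."""
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        # Each page keeps a small baseline, then receives shares of the
        # rank of every page that links to it.
        new_rank = {p: (1.0 - damping) / len(pages) for p in pages}
        for page, outgoing in links.items():
            share = damping * rank[page] / len(outgoing)
            for target in outgoing:
                new_rank[target] += share
        rank = new_rank
    return rank

# Tiny hypothetical web: "hub" is linked to by both other pages,
# so it ends up with the highest rank.
web = {"hub": ["a", "b"], "a": ["hub"], "b": ["hub"]}
ranks = pagerank(web)
assert ranks["hub"] > ranks["a"] and ranks["hub"] > ranks["b"]
```

Social curation works the same way at a human scale: every share or “like” is a link vote that pushes content up someone else’s ranking.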
Finally, Rheingold pushes us to go one step further: “Google itself is not the curator; we are. Every time a person references a link, they help to curate the Web” (p. 127). After we’ve waded through the huge amount of information and deemed what is reliable and attention-worthy, we can participate by becoming the curators. Thesis 72 in the Cluetrain Manifesto gets at this: “We like this new marketplace much better. In fact, we are creating it.” As a community of curators, we’re no longer just consumers of corporate rhetoric; we are empowered to determine value for ourselves.
Three sails to staying afloat in information overload. Drawing from Coloring Son
Actually, Rheingold’s principles for being a “filter blogger” bear a surprising resemblance to what we do as technical writers. We take on a huge amount of information and distill it for what is important. Although technical writing then moves to the next step of content creation, it begins with managing and curating available information. We daily practice the skills of culling information and can appreciate the wealth of opportunities offered by the Internet without being swept away.
Dewey, C. (2015, March 25). Is the Internet giving us all ADHD?. Washington Post. Retrieved from https://www.washingtonpost.com/news/the-intersect/wp/2015/03/25/is-the-internet-giving-us-all-adhd/
Sreenivasan, S. (2015, April 20). Attention: The new currency. TEDx Broadway. https://www.youtube.com/watch?v=8I4WkhG_GRM
Villi, M. (2012). Social curation in audience communities: UDC (user-distributed content) in the networked media ecosystem. Journal of Audience and Reception Studies, 9(2). Retrieved from http://www.participations.org/Volume%209/Issue%202/33%20Villi.pdf
Before airing a new TV show, networks and studios test the pilot on an audience focus group. The audience members turn a knob based on their reactions to different parts of the episode, and their response can determine whether the show makes it to the screen or dies right there (“Test Audiences Can Make or Break New T.V. Series”).
In the technical communications world, understanding our audience and receiving audience feedback is also vital to creating high-quality documentation, but it’s much harder to achieve. Blakeslee writes about “the importance for technical communicators of continuing to give careful thought both to identifying their audiences and to accommodating their audiences’ needs and interests” (p. 200), yet she says that our industry has failed to investigate audience needs in the digital age. It seems to me that we misunderstand our audience in several ways, including their relation to technology, and the lack of audience awareness can severely limit our documentation.
One pitfall of not appropriately understanding our audience is falling into the activity theory framework, where we narrowly define our audience based on a single task instead of a comprehensive cultural understanding. As Longo states,
“If, as technical communicators, we make decisions based only on our understanding of activities and not of the cultural contexts in which these activities are embedded, we run the risk of proposing documents and systems that do not fit well with the organization where we work and our goals for the future” (p. 160).
At the company where I work, we constantly balance specific task-oriented instructions against a larger understanding of strategic and operational needs. Here are the steps to set up XYZ printer. Why? Because a certain type of medication label only prints on XYZ printer. Understanding that context, can we also guide readers about how many printers they’ll need and where to place them?
Not only do we need to learn about our audiences’ situations and goals, but we also need to learn how the audience approaches the documentation itself based on their cultural context. In “Understanding Digital Literacy Across Cultures,” Barry Thatcher offers several warnings about how the culture of our audience changes their approach to documentation. Although his main example concerns internal communication, the same principles apply to customer-facing documents, as reflected in the school websites that he analyzes. By knowing more about the culture of our audience, we can tailor tone and content to appropriately address an individualist vs. collectivist mindset, or a universalist vs. particularist understanding. I shudder sometimes to think of all the things I ignorantly say just because my perspective is so limited. The American Marketing Association published “The Olympics are Coming: Lessons for Cross-Cultural Advertising” precisely to head off some of those foot-in-mouth moments.
Finally, as Blakeslee alludes to, we need to understand how our audience approaches documentation differently when it’s digital. This goes directly to Katz and Rhodes’s discussion of six ethical frames through which audiences might approach technology. I might seek ways to optimize electronic document delivery, seeing it as both a means and an end. The reader who receives the document likely sees the delivery process as only a tool, valuable solely as a delivery mechanism. Similarly, if we approach our documents assuming a sanctity frame, we could alienate task-focused readers who have an “us and them” mindset toward technology.
Technical communicators don’t get nearly as much help understanding our audience as T.V. shows do. Instead of focus groups, we get occasional blog comments. However, I think the more we know about our audience, the more we can create content that addresses their specific context, culture, and relation to technology.
I have really enjoyed this class and interacting with all of you on this blog. This course has helped me see my current (and future) workplace situation through different lenses, and I feel this has made me stronger professionally. I chose to write my paper on what skills technical communication professionals need to succeed in the modern and future workplace. I have pasted my abstract below; please let me know what you think!
Emerging media have completely changed the face of traditional technical writing. The introduction of Web 2.0 has created user needs that supersede the tangible printed-and-bound instruction manuals that previously defined the field. As a result, workplaces have established new requirements for the skills ideal technical writing candidates must possess, and universities have strategically designed programs to keep pace with these trends. Successful technical writers are now tasked with determining the most effective structure for presenting information; the best terminology for particular users; the appropriate design strategies to maximize accessibility; and the optimal platforms and technology for delivering products. This paper will define modern technical communication and highlight the essential skills and abilities required for success in the industry. It concludes with my personal experience of these dynamics as a technical communications professional in multiple workplace settings.
The skills I then listed are to:
- Understand business operations and corporate financial goals to prove their value to the workplace
- Possess collaboration skills and the ability to work in a team environment
- Maintain a thorough familiarity with leading industry tools and trends
- Possess solid writing, composition, and oral communication skills
- Possess the ability to evaluate their own work performance as well as that of others
- Possess document design knowledge
- Possess the ability to execute tasks and projects with enthusiasm and to meet deadlines with little support from management
All I could think of while reading Kenichi Ishii’s article, “Implications of Mobility: The Uses of Personal Communication Media in Everyday Life,” was, “This sounds a lot like present-day American youth.” The study was conducted between 2001 and 2003 in Japan, but I doubt Japan’s comparatively introverted culture had as much impact on the results as the authors let on.
The article notes that “32% of Japanese adolescents agreed with ‘I can easily start talking straight away to someone I do not know’, whereas 65% of their U.S. counterparts agreed” (pg. 349). I understand American adolescents may be more socially confident, but I believe this has little effect on their dependency on “mobile mail,” better known as texting.
The article also mentions that “Japanese youth increasingly seek to avoid conflict and friendships with deep involvement,” and that they practice “long term withdrawal from society” (pg. 349). My first reaction was that perhaps SMS messaging initially became more popular among Japanese adolescents than it did in the U.S. As a consequence, maybe they began seeing the negative effects of such convenient, impersonal communication sooner than we did, and it had more time to penetrate their culture.
However, if that were the case, American adolescents would never have become dependent on SMS, especially considering their noted “superior” social abilities. I doubt dependency on SMS messaging varies much across cultures, because it’s not a matter of cultural inclination; it’s a matter of convenience.
The contextual dimension of mobility (pg. 347), which grants non-business users freedom and privacy, is in my opinion the key to this situation. Convenience, privacy, and freedom from parents’ rules are what created and maintained adolescents’ interest in SMS. This reminds me of Sherry Turkle’s warning that our desire to connect with each other on mobile devices is replacing our desire to connect face to face.
This article speaks volumes about the monster mobile communication has created, and it’s even more striking given how old it is. Roughly 12 years later, we have less control over mobile devices and communication; they take up ever more of our time through social media, and it seems to be getting worse.
Adolescents and students are no longer the primary users of SMS messaging; the addiction is just as widespread among adults. Many of the adolescents who grew up using social media are now young adults, and its impact on their social development is an area of personal interest for me. It’s also interesting that the negative social effects of mobile technology were so obvious from the beginning.
It’s difficult to recognize the bad habits you’re falling into while you’re in the situation, and I’m beginning to see the value of that quiet time Sherry Turkle mentioned more than ever.
While reading Toni Ferro and Marc Zachry’s “Technical Communication Unbound: Knowledge, Work, Social Media, and Emergent Communicative Practices,” I noticed some striking similarities to my own job. The article analyzes technical communication professionals’ workplace use of publicly available online services (PAOS), and I can completely relate to its findings. The table below explains this in greater detail (pg. 16):
I’m an eCommerce Copywriter for multiple retail brands, and sites like Wikipedia, Google Docs, Skype/WebEx, and Amazon.com are my backbone. To write product descriptions, I need either a sample (which is never available) or a product description from a vendor’s or competitor’s site. Roughly half of my workday is spent researching products and putting existing descriptions into my own words.
The table above notes that 60% of participants reported using Wikipedia for “learning about a topic,” and this is true for me personally as well. There are times when I’m given products for sports or hobbies I’ve never even heard of, and I depend on Wikipedia to explain what they are. For example, last week I was given 100 SUP accessories to write about for our company website and had no idea what the acronym SUP even stood for. Wikipedia saved the day with a robust explanation that helped me write my product descriptions like an expert.
Google Docs is another program I couldn’t do my job without, because when we write copy for these products, other departments like imaging and merchandising need real-time visibility into our progress. Most lists of products that need copy are distributed in a Google spreadsheet, and as we complete copy, we simultaneously check products off the list so that colleagues can initiate the next step. Google Docs is our go-to for sharing and editing documents, and its absence would make everyone’s job nearly impossible.
Ferro and Zachry go on to ask, “What is the relation between what we are designing our classes and overall curriculum to achieve, and the things students will be doing after they are with us?” (pg. 19). I had been anticipating this question from the moment I read through the survey data. Given the amount of rapidly changing technology we face and grow increasingly dependent on, PAOS are no longer a workplace or educational distraction. I personally feel students could benefit from a course geared toward helping us identify and maximize these resources. I’d even be interested in taking a course on how to create them.
I was also happy to see the statement in the Pedagogical Implications section: “Technical communicators today rightly express concerns about how we should teach students to write in forms that did not exist 3 years ago – and some that do not yet exist” (pg. 20). The ability to predict, effectively navigate, and communicate in the PAOS environment can make or break an employee’s success in the workplace. Employees who can create and monitor expert wikis, become masters of developing associations and relationships online, and internalize electronic planning and coordination are greater assets to their companies than employees with identical work knowledge and experience who lack these qualities. I’m very interested to see how educators will introduce this material, and how the change will be reflected in the technical communication discipline.
Every once in a while, I open a product I have just bought, and feel a little nostalgic for the days of paper manuals. I guess there’s some comfort in knowing that I can seek out instructions regardless of whether I am online. The truth is, when a question does arise, it is second-nature to sit down and search the internet. And, honestly, when am I offline anyways?
I do remember the days when online help wasn’t so easy to come by. If a manual didn’t have an answer I needed, or I didn’t understand it, I was stuck with the time-consuming task of doing my own research. Other times, I would come across mistakes in the instructions, or information that became outdated after a software update.
So while I think I “miss” the days of paper documentation accompanying products, I don’t miss all that they represent. I like that I can search for specific issues quickly. I love that outdated or inaccurate information is usually wiped away. And, it’s super convenient that customer support is often a click away, instead of requiring a call to the customer support line.
Now don’t get me wrong, I still print out a lot of the instructions that I look up in customizable searches. I do this because, in many cases, it is easier for me to follow directions on paper. (It is an annoying personality quirk of mine that costs me untold amounts of money buying ink and paper.) I also find that I often look up the same issue repeatedly. I have certain applications that I use on a regular basis. There is usually a function or two that I only use occasionally, so I find that when that rare occasion comes up, I need a refresher on how to do it.
Along with my printing habit, I like to cut and paste chunks of helpful or interesting information from help sections into a Microsoft Word document for future reference. I bookmark a lot of pages too. There is a problem, though: this inconsistent data collection makes it very difficult to access the information later. I have to search my saved documents, trying to remember whether I saved something on my laptop or desktop, hard drive or memory stick. If I bookmarked a page, then I have to search through all the bookmarks in both Chrome and Internet Explorer. And this assumes I actually recall saving it in the first place. Often I look up the same information again, only to notice, when I go to save it, that I already had it. Sigh.
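At bottom, my scattered-snippets problem is a search problem, and even a small script would take the guesswork out of it. Here’s a minimal sketch in Python (the function name, the plain-text-only filter, and the folder list are my own assumptions, not a real tool) that scans a set of folders for any saved note containing a given term:

```python
import os

def find_snippets(root_dirs, term):
    """Search saved plain-text notes under root_dirs for a term, case-insensitively.

    Returns a list of (path, line_number, line) matches, so you can see at a
    glance whether a piece of help text was already saved, and where.
    """
    matches = []
    needle = term.lower()
    for root in root_dirs:
        for dirpath, _dirnames, filenames in os.walk(root):
            for name in filenames:
                if not name.lower().endswith(".txt"):
                    continue  # only look at plain-text notes in this sketch
                path = os.path.join(dirpath, name)
                try:
                    with open(path, encoding="utf-8", errors="ignore") as f:
                        for line_no, line in enumerate(f, start=1):
                            if needle in line.lower():
                                matches.append((path, line_no, line.strip()))
                except OSError:
                    pass  # unreadable file; skip it and keep going
    return matches
```

Pointing something like this at both a laptop’s and a desktop’s notes folders would at least answer the “did I already save this?” question in one pass, instead of a hunt through two machines and two browsers.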
The idea of being able to customize my own instructional text on a site is an incredibly exciting concept (Spilka, 2010, p.206)! I imagine all those topics that I go back to time and time again at my fingertips. No more haphazard organization of all the information I want to retain. No more wasted time looking for information, only to realize I already have it documented somewhere. Just one site to go back to, the source. Not only would all the information that I need be structured in the way that best meets my needs, but I could also add more information or remove what I no longer need. That would be the ultimate user experience!
Until that becomes widely available, I will continue to appreciate the ways digital media enables writers to provide better and more targeted content. The use of digital media has not led to a homogenized audience; instead, it has given writers many new opportunities to tap into the specific needs of the reader. They no longer have to make assumptions about the reader’s needs and can instead draw on a variety of user information gathered by observing the user directly. In many ways, the move toward online documentation defies the image of the internet widening the distance between people. In this instance, online media allows for a greater personal connection with the audience.