Coming to grips with the “Internet of Things”
Posted by evelynmartens13
So, I suppose this is tangential to this week’s readings (or maybe at the heart of them), but I kept going deeper and deeper into the Internet as I studied the issues of privacy, ethics,
and problematic internet use (PIU), straying far from my topic, getting lost in all sorts of sidetracks. For example, I came across the word “paraphilia” in one article and didn’t stop to look it up, but then I came across it again. I was reading an article that mentioned that the Diagnostic and Statistical Manual of Mental Disorders (DSM-5) (http://www.dsm5.org/Pages/Default.aspx) was updated in 2013 and would now include PIU, which I found interesting and relevant to this week’s readings. So, I went to the DSM site and found, in fact, that “Internet Gaming Disorder” is included in Section III, which is apparently a research section, because it explains, “By listing Internet Gaming Disorder in DSM-5’s Section III, APA hopes to encourage research to determine whether the condition should be added to the manual as a disorder.”
It was at this site that I saw “paraphilic” again, so I decided to do a search and spent over an hour just reading up on those. I won’t offer you a link, but you can Wikipedia it and see at a glance why I got distracted. Or perhaps I’ve just been sheltered?
Anyway, I don’t think I would qualify as one of the addicted just yet, but this is the kind of thing I worry about — getting sucked into the Internet “black hole.” I mean, I really had to force myself to stop going everywhere willy-nilly and exert some discipline — problematic internet use? Scott Caplan makes a distinction between excessive use (simply a large amount of time online) and compulsive use (a lack of impulse control), and says that what might be seen as excessive might just be what is required for a student to complete an assignment (that’s probably me, so far), whereas compulsive use is more likely to result in negative outcomes (pp. 724-725).
Speaking of negative outcomes, before I started this course, I thought about Internet privacy challenges mostly in terms of social media and the fact that some people seem to lack
boundaries with regard to self-disclosure. Now I have a much broader (and more disturbing) understanding of the privacy challenges we face, including how easy it is to track our digital footprints. Still, like the people in this Varonis report, I do very little to protect my privacy.
Maybe there’s regulatory help on the way? According to this November 12 article from Politico (http://www.privacylives.com/politico-ftc-wading-into-internet-of-things/2013/11/14/), the Federal Trade Commission is going to start taking an interest in privacy issues because so many everyday objects (“thermostats, toasters, and even sneakers”) are getting connected to the Internet. Some of the more interesting ideas: pill bottles that keep track of whether you took the pill, refrigerators that tell you when the milk will expire, and forks that track how fast you eat, all of which could embed sensitive information about individual consumers that could then be inappropriately shared. This echoes Carina Paine and Adam Joinson’s concern that areas of our lives previously considered offline are now areas of privacy concern, magnified online (p. 16).
Some trade groups are concerned that this new interest from the FTC might inhibit innovation, so it should be interesting to see if the FTC will be able to do much reining in. By the way, when I went to retrieve the Politico URL, I saw an article about “hacktivist” Jeremy Hammond getting 10 years in prison, so of course, I had to stop writing and spend another 45 minutes learning what that was all about. Oh well, I guess that’s the nature of the “Internet of Things” (that’s the name of the FTC workshop).
Finally, I found Steven Katz and Vicki Rhodes’s piece, “Beyond Technical Frames of Human Relations,” a bit hard to absorb. If I understand their argument, it’s time to move beyond previous ethical frames to “human-machine” sanctity, which “recognizes the new relationship between humans and machines as whole entities” (p. 250). Call me old-fashioned (for sure!), but I don’t want to have “reciprocity” with my machines (p. 251). The authors bemoan the fact that some mechanized procedures and processes, most notably content management systems, seem to operate according to the machine’s specifications and for its own purposes rather than for people or organizations (p. 235), but their proposal that we humanize our machines so that they become “you”s rather than the objects they actually are seems like a prescription for making the situation worse.
Did I just not understand this? Do I just need to come to grips with digital “being” and the “Internet of Things”?