Big Data and the Human Face of Privacy

Description
Persuasive essay on the dangers that collecting vast amounts of data poses to all humans
Carlos García
Erika Beckstrand
ENGL 2010
23 June 2017

Big Data and Its Human Implications

Living today in a society that uses the internet for practically everything may also imply that there is no reason to expect privacy. Many have said that privacy died with the dawn of the digital age: that privacy is less important today than it was before, and that even if we have already given it up, the benefits are worth it. While it is true that the technology we have created can result in important innovation, there is no guarantee that we are on track to harness it responsibly. We rarely consider the human implications of the development of digital technology. Rick Smolan, an avid advocate for the careful consideration of big data's implications, has noted, "Every time there's a new tool, whether it's Internet or cell phones or anything else, all these things can be used for good or evil. Technology is neutral; it depends on how it's used" (Smolan). As Smolan points out, it is ultimately we humans who decide how we use technology. It is unreasonable to point at phones as the direct cause of phone scams, and it is equally unreasonable to think that big data is responsible for the decisions we make with massive amounts of information on individuals. The Internet and big data will have a formidable sway in how we understand humanity and the scope of the effects that our actions may have on society at every level.

The control we exert over technological advances implies that we can shape big data with ethical and humane considerations. We must decide what safeguards to put in place to protect individuals and the public, because we live in a world where data influences people. Abstract pieces of information are used every day to decide events that shape people's lives. Data, or information if you will, is much more than ones and zeros.
Information is a representation of reality, and it has the capacity to convey many crucial characteristics of an individual's human experience. When individual pieces of information are combined with others, they have the potential to identify, describe, and affect humans.

If presented only with 19º54'57" N, 75º8'23" W, those coordinates might not mean much. On the other hand, if, for example, we discovered that those coordinates represent the current and exact location of the Twitter account @michaelbay, one could suggest that Michael Bay has finally been imprisoned in Guantanamo Bay for directing the Transformers movies. For the purposes of this example, it takes only two points of data to formulate a conjecture about the life of an individual. It is when this kind of aggregation happens that data, by itself, is able to identify us, and that becomes dangerous in the world in which we live. It is not far-fetched to think that the physical presence of an individual's phone at the time of a crime might be admitted as evidence not only that the individual in question was present at the crime scene, but also that he or she is responsible for the crime.

Currently, little is done to consider the humanity inherent in the users of digital services. There are plenty of stories about how companies violate people's privacy. For example, if you emailed your veterinarian about the death of your beloved pet, it is likely that you will begin seeing ads related to pet care and veterinary services. According to Tania Lombrozo, an associate professor of psychology at the University of California, Berkeley, "The problem is this: The data-mining tools that glean our interests and choose our ads don't fit into the complex flow of information we've spent our lives charting and mastering.
We don't have a map that tells us how a particular bit of information made it from Point A to Point B, nor the social context that gives us insight into why" (Lombrozo). In her argument, Lombrozo suggests that, as humans, we mentally create the social structure that we know information will follow, but the structure of how our information propagates through advertisers and data collectors remains obscured to us. These tendencies to disregard the human side of the interaction can be reduced, or managed better, if the people the data describes play an active role in how the information is gathered, how it is processed, and for which purposes it can be obtained.

The structure that guides the flow of data on the internet does not need to be obscured from the eyes of the humans interacting with data services. According to Dr. Richard Mortier, a privacy researcher at Cambridge University, there are three principal characteristics that human-data interactions should exhibit to grant users dignity and control: legibility, agency, and negotiability (Mortier). These interactions need to consider the user as an equal to the service, and the characteristics Dr. Mortier describes would guarantee that these interactions happen on a level playing field for both the user and the data-collecting entity.

Legibility defines the ability of users to understand what information is collected and what we are allowing the company to do with it. Companies often assume that making their privacy practices available is enough, and although that may meet legal requirements, it is not enough for the average user. The policies need to be legible and understandable to the user. Thus, genuine attempts to inform the user should be pursued rather than releasing information about privacy in a dark corner of a rarely read document.

Agency establishes the power of the user to actively manage the data they are sharing.
This aspect requires that the user be able to provide, thanks to legibility, informed consent. Additionally, a user should be able to correct mistakes that have been introduced through inference from data already collected. For example, if there is a store where a person buys only vegetables because they are fresher there, that particular store would be inclined to think that this individual is a passionate vegan. On the other hand, if the same person buys only meat at a different store because its location is more convenient for him or her, that other store could infer that this person is a dedicated carnivore. Each store has only a limited amount of information that, by itself, is not enough to support the inferences they nonetheless make about a person. Regardless of this inevitable uncertainty, companies make inferences about us every day without our being able to correct what they have asserted about us. Algorithms that process our information are more accurate than ever, but that does not mean they are perfect. Only humans are capable of completely correcting those errors, as minimal as they might seem. Jamie Bartlett, a journalist for The Telegraph, indicates that in a survey of 1,464 UK consumers, "Nine out of ten consumers believe that they should 'be able to control what information organizations collect about me and what they use this information for'" (Bartlett 25). Even though this is a legitimate public concern, agency is not currently achievable, for two main reasons. First, companies do not consider the user the owner of the information, and thus will not allow the user to correct it. Second, we lack the prerequisite of legibility as a foundation for agency; in other words, users are unaware that information about them is being collected.

Lastly, negotiability refers to the ongoing engagement of the user in the process of creating, capturing, sharing, and analyzing the data that describes them.
The interests and opinions of people change with time, but companies rarely allow users to reflect those changes of interest. This inability to update one's information results in a disadvantage for the user. In the words of Mortier, "Although we agree that it may well be possible to enable an ecosystem using economic value models for utilisation of personal data and marketplaces. We believe that power in the system is — as of 2016 — disproportionately in favour of the data aggregators that act as brokers and mediators for users, causing the apparent downward trajectory of economic value in the information age" (Mortier). With systems that implement negotiability, a user would have the ability to change their mind and renegotiate the terms they have accepted because their perspective, or they themselves, have changed. Humans, after all, are changing entities that are hard to describe purely as information.

Once these three requirements have been met, users will enjoy finer, informed control of their privacy. While there are technical difficulties that will need to be solved, the prospect of an informed and manageable state of privacy is very desirable in modern society. According to Glenn Greenwald, privacy advocate and recognized journalist, "There's a reason why privacy is so craved universally and instinctively. It isn't just a reflexive movement like breathing air or drinking water. The reason is that when we're in a state where we can be monitored, where we can be watched, our behavior changes dramatically. The range of behavioral options that we consider when we think we're being watched severely reduce" (Greenwald). If we do