In surveillance capitalism, the traded good is personal data. "Surveillance capitalism is a further evolution of capitalism that follows in the old pattern of taking things that live outside the market, subordinating them to the market dynamics as commodities that can be sold and purchased, but with a dark twist," Zuboff says in the interview. Her claim, and the dark twist, is that the companies collect data without the users' awareness. The answer is that surveillance capitalism threatens an aspect of our freedom so basic that we are not used to defending it. At this point it is interesting to revisit Shoshana Zuboff's claim: "We are engineered into ignorance."

Chen et al. [4] point to some of the data types being collected in an e-commerce, big data analytics setting, including search and user logs, customer transaction records and customer generated content. E-commerce and market intelligence are among them. In the mobile digital world we live in now, the customer generated content can even be health data typed in and generated by health apps on our mobile devices or smart watches. These are data defined as special category data within GDPR, and they are especially important to protect to ensure our legal and human right to privacy. The web sites implementing these tools get great functionality for free, but at "the cost" of letting Google know more about the users visiting the web pages. We do not understand the amounts of data being gathered and how they are being used.

GDPR Recital 32 further details the requirements for consent: "Consent should be given by a clear affirmative act establishing a freely given, specific, informed and unambiguous indication of the data subject's agreement to the processing of personal data relating to him or her, such as by a written statement, including by electronic means, or an oral statement." Thus, in my opinion it is possible to argue that most aspects of GDPR Article 7 and Recital 32 are fulfilled. But are we competent to give the consent? Health personnel are required to evaluate a patient's competency to consent to or refuse medical treatment. As of now, that is a topic to be addressed in court or at a government regulatory level.

The Privacy Paradox is a phenomenon that has been described, but it is still not fully understood by researchers. We have also illustrated that individual users have experienced moral dissonance for a while, reflected by the Privacy Paradox, but that they continue to use the digital services anyway (moral neutralization?). There is strong reason to question whether current practice is morally right, and that is why we are starting to see initiatives for stronger regulation in the US, and a continued focus on GDPR within the EU. The discussion will therefore be different in other parts of the world. As already discussed, Facebook has been subject to scrutiny by the US Congress, and the news about the Cambridge Analytica scandal has been extensively covered by the public press.

What if the data sets being traded and used to build the prediction models are anonymized data? Luc Rocher and his colleagues [10] found that they could re-identify 99.98% of all Americans by using 15 demographic attributes from different anonymized data sources.
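To make the re-identification risk concrete, here is a minimal, purely illustrative sketch of a linkage attack. It is not the method used by Rocher and colleagues [10], who estimate re-identification risk with statistical models; the records, column names and values below are invented for illustration.

```python
# Toy linkage attack on an "anonymized" data set (illustrative only).
# Direct identifiers have been removed, but a few demographic
# quasi-identifiers are enough to re-attach names from a second source.

import pandas as pd

# "Anonymized" health records: name and e-mail removed, demographics kept.
health = pd.DataFrame([
    {"zip": "0150", "birth_year": 1984, "sex": "F", "diagnosis": "diabetes"},
    {"zip": "7030", "birth_year": 1991, "sex": "M", "diagnosis": "asthma"},
])

# An identified data set obtained elsewhere (e.g. a purchased marketing list).
marketing = pd.DataFrame([
    {"name": "Kari Nordmann", "zip": "0150", "birth_year": 1984, "sex": "F"},
    {"name": "Ola Nordmann",  "zip": "7030", "birth_year": 1991, "sex": "M"},
])

# Joining on the shared quasi-identifiers re-attaches identities to diagnoses.
reidentified = health.merge(marketing, on=["zip", "birth_year", "sex"])
print(reidentified[["name", "diagnosis"]])
```

With only three quasi-identifiers the join is already unambiguous in this toy case; the point made by Rocher et al. is that with around 15 such attributes it becomes nearly unambiguous for an entire population.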
Surveillance capitalism is an economic system centred around the commodification of personal data with the core purpose of profit-making. It has become the default business model for the majority of startups and apps that we use. The more users and the more data a company has, the more the company and the data set are worth. Mass surveillance is the inner logic of a "nudge" society. Users are kept in the dark, and the methods used are designed to keep them ignorant and without a real choice to say 'yes' or 'no' to the data collection and data treatment. The algorithms, powered by our personal data, that predict what we like and dislike tend to show us biased information. Take, for example, the role of Facebook in the US 2016 election, where Russia is accused of backing over 3,000 ads, many of which violated US federal law. Zuboff does right in giving us her warnings.

While the benefits of Big Data have received considerable attention, it is the potential social costs of practices associated with Big Data that are of interest to us in this paper. On the other hand, I will highlight the harms of simply collecting data by applying Robert Sloan and Richard Warner's notion of "merely knowing," which they reserve for government surveillance, to surveillance capitalist practices. This presentation will be accessible to a general audience, and it will be useful for anyone who uses the products and services offered by the largest surveillance corporations: Google, Facebook, Amazon, and Microsoft. A confluence of trends led to the present-day reemergence of AI at a precarious time in history.

The Privacy Paradox is the situation where people claim to be concerned about their online privacy but do very little to protect their personal data: they continue using digital services. Of course this neutralization is based on the information asymmetry, and on our ignorance of what is actually going on, which Bandara and colleagues [20] pointed to. Is it not only fair that the companies generate profit on our data when we get to use their services for free? It depends on what we as a society value the most. But if users give consent, is it a truly informed decision? And if it is not sufficiently informed, then it is illegal, or…?

Cross-tracking cookies are another example of a technology that allows analysis of user behavior across digital company borders. Now, seven years later, the cross-tracking capabilities are even more advanced. Security and privacy breaches' effect on goodwill seems to be an interesting future research topic. This is the period after the Cambridge Analytica incident.

Even the prediction results are considered personal data. It is easy to imagine that the same data set would be useful input for an insurance company to set the insurance premium, for instance. This is legal. But what if multiple data sets are purchased by a company that uses AI to re-identify data based on 15 data points…. Example 2: Or imagine that a company collects user logs and user activity on a network to search for anomalies and signs of security breaches. Those data cannot be used by the employer to evaluate user behavior and employee effectiveness, even though they could be strong predictors of good or poor performance.
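As a concrete illustration of Example 2, here is a minimal sketch of how user activity logs might be screened for anomalies. It is not any particular vendor's method; the users, counts and the three-standard-deviation threshold are invented, and the same logs could obviously be repurposed to profile employees, which is exactly the ethical concern.

```python
# Toy anomaly screening of per-user login counts (illustrative only).
# A real deployment would use richer features and models, but the privacy
# issue is the same: the raw material is a detailed log of user behavior.

from statistics import mean, stdev

# Invented baseline: daily login counts per user over the previous work week.
baseline = {"alice": [8, 9, 7, 8, 9], "bob": [3, 4, 2, 3, 4]}
# Invented counts for today.
today = {"alice": 41, "bob": 3}

for user, counts in baseline.items():
    mu, sigma = mean(counts), stdev(counts)
    # Flag activity far outside the user's own historical pattern.
    if sigma > 0 and today[user] > mu + 3 * sigma:
        print(f"Possible anomaly: {user} logged in {today[user]} times today "
              f"(baseline {mu:.1f} ± {sigma:.1f})")
```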
We have centered the discussions around the fact that the traded good in surveillance capitalism is personal data. Still, we have seen the emergence of surveillance capitalism under the radar of democracy, as Zuboff rightly claims. In an interview with Democracy Now [1] she explains how our personal data and online behavior are collected, packaged and sold as products to companies that use the data to predict our present and future behavior. This whole system is "intentionally designed …". This is important, for it points to a gap in Shoshana Zuboff's immensely (and rightly) influential recent book, "The Age of Surveillance Capitalism." For Zuboff, the motor driving mass surveillance is capital accumulation.

Do we know what we sign up for? Most companies have well defined terms of use and cookie policy descriptions. Would we be willing to pay for the services, and how much would we be willing to pay? Should we not expect users to take responsibility for their actions? Is the patient capable of making credible choices that survive over time? Does the competitive use of data and analytics lead to ethical problems? They trade with the best intentions. On a generic level, on the other hand, we can look at some of the most well-known companies within this space: Google, Facebook, Twitter, Amazon. Big data provides remarkable benefits, they claim, so users choose these benefits over risks they are somewhat aware of, but do not understand.

Examples of moral dissonance have been observed in user groups by the research community over the last decade. In my opinion we as a society are currently experiencing moral dissonance. However, in her article Gesenhues points to other factors. Remember that users leaving the platform is bad for business in the surveillance capitalism paradigm. But the evidence, and the examples that have come to governments' attention, point in the direction of stricter regulation, and in disfavor of moral neutralization of the status quo at a societal level. There is no need to repeat the arguments in the direction of not justifying current practice. How can we make informed decisions in democratic processes if we are seldom exposed to alternative views and arguments to our own? We have started to see, but not yet fully understood, all aspects of the surveillance economies and data driven communities. Let us get back to the original question of this chapter: is it legal? The tricky part is to balance all viewpoints listed in this chapter, and all the ethical aspects discovered by using the Navigation Wheel.

The European Commission [2] defines personal data as any information that relates to an identified or identifiable individual. Consequently, all data within the scope of surveillance capitalism is defined as personal data, including user generated content and meta-data. The meta-data itself is not necessarily interesting. Similar to meta-data, customer generated data can also be analyzed to predict behavior and generate user profiles. The customer generated data is interesting to analyze in itself. What is our mood? Hansen et al.'s [8] research was also an early warning that we should be concerned. In such a case it becomes extremely important for the companies to grow large customer bases, but at the same time also to expand their data collection beyond the "partial identity" data that a user would leave at a single company's services.
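The "partial identity" point can be made concrete with a small sketch. It is purely illustrative: the services, the hashed-email tracking identifier and all attribute values are hypothetical, but the mechanism, joining fragments left at different providers into one richer profile, is the one described above.

```python
# Illustrative sketch: merging "partial identities" held by separate services
# into one rich profile once they share a common key (here a hashed e-mail
# used as a hypothetical tracking identifier). All data is invented.

import hashlib

def tracking_id(email: str) -> str:
    """A stable pseudo-identifier derived from the e-mail address."""
    return hashlib.sha256(email.lower().encode()).hexdigest()[:16]

# Partial identities held by three different services.
web_shop = {tracking_id("kari@example.com"): {"purchases": ["running shoes", "heart-rate monitor"]}}
news_site = {tracking_id("kari@example.com"): {"articles_read": ["election coverage", "diabetes research"]}}
fitness_app = {tracking_id("kari@example.com"): {"resting_heart_rate": 54, "weekly_runs": 5}}

# A broker holding all three sources can join them on the shared identifier.
profiles: dict[str, dict] = {}
for source in (web_shop, news_site, fitness_app):
    for uid, attributes in source.items():
        profiles.setdefault(uid, {}).update(attributes)

print(profiles)  # one merged profile combining shopping, reading and health data
```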
Shoshana Zuboff has explored a phenomenon she calls Surveillance Capitalism. In her 2019 work "The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power," Shoshana Zuboff (Charles Edward Wilson Professor emerita at Harvard Business School) offers a revolutionary look at the history, practice, and ethics of data collection, aggregation, and monetization. Surveillance capitalism is the technological continuation of unlimited growth with unlimited data, and it is already busy undermining democracy, autonomy, freedom and human rights, and deepening wealth inequality, online and in areas such as Western China, against minorities. This is a really interesting topic that deserves ethical reflection.

In a 2012 Special Issue on business intelligence research, Chen and his colleagues [4] explain areas where business intelligence and analytics can make major impact. Mitchell [1], Sherman [2] and Chen et al. [4] … Advertising companies were able to go from directing ads towards customers based on key words in the text they read (e.g. …) to using personal data to target consumers more precisely. Our discussion so far has been on the critical side, looking at the possible negative sides of the surveillance economy, or user data driven economies.

However, it is possible to take a more optimistic view. Overall, it is easy to find good arguments that justify current practices. Customers/users are satisfied with personalized, easily available content (driven by AI). Could not users use their market power by deselecting and avoiding companies with unethical behavior? Nothing is black or white. Companies interacting in the surveillance capitalism domain are huge multinationals, often based in the US or China.

Within Europe, personal data processing is regulated by GDPR [3]. The EU has the strongest privacy regulations in the world, through GDPR. In the worst case companies can be fined up to 20 million euros, or 4% of the annual global turnover. However, even with GDPR there are ethical dilemmas: how do we interpret the privacy principle of purpose limitation in a data driven environment powered by advanced machine learning and artificial intelligence? There is also a growing concern at government level about 'echo chamber' effects within social media and the data driven economy, exemplified by a piece written by Grimes [18]. Is the patient in a position to reason and assess options and alternatives? Does the patient understand the consequences of his/her choices?

Names, phone numbers, email addresses and similar are erased from the data set, while the remaining data is being traded/communicated to a third party. This is confirmed by a study from Youyou and colleagues from the University of Cambridge and Stanford [6], where they found that machine supported big data analysis of users' Facebook likes could be used to predict user personality, and to predict life outcomes such as substance use, political attitudes, and physical health with a high degree of validity.
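To illustrate the kind of analysis Youyou and colleagues [6] describe, here is a minimal sketch of predicting a single trait from page likes. It is not their model or data: the pages, like-matrix and labels are invented, and a real system would be trained on millions of users, but it shows how a simple classifier turns "likes" into a probabilistic profile.

```python
# Toy trait prediction from page likes (illustrative only, invented data).

from sklearn.linear_model import LogisticRegression

# Columns of the like-matrix correspond to these hypothetical pages.
pages = ["hiking club", "energy drink", "poetry magazine", "esports team"]

# Rows: users. Entry is 1 if the user liked that page.
likes = [
    [1, 0, 1, 0],
    [1, 0, 0, 0],
    [0, 1, 0, 1],
    [0, 1, 1, 1],
    [1, 1, 0, 0],
    [0, 0, 1, 0],
]
# Invented label we try to predict from likes alone (e.g. a self-reported trait).
is_extrovert = [0, 0, 1, 1, 1, 0]

model = LogisticRegression().fit(likes, is_extrovert)

new_user_likes = [[0, 1, 0, 1]]  # liked "energy drink" and "esports team"
probability = model.predict_proba(new_user_likes)[0][1]
print(f"Predicted probability of trait: {probability:.2f}")
```

The design point is that nothing sensitive has to be typed in by the user: the prediction is derived entirely from behavioral traces.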
In this paper we have looked at ethical aspects related to the concept Shoshana Zuboff calls Surveillance Capitalism. Surveillance capitalism as a term was popularised by sociologist Shoshana Zuboff in 2014. Surveillance is not an intrinsic part of internet-based technologies, but as Zuboff explains, surveillance capitalists are the ones who have captured our digital technologies and repurposed them to generate profits of various kinds. Observing that surveillance capitalism undermines freedom by transforming "human experience" into "behavioral data," Zuboff argues that it is unethical for corporations, such as Google and Facebook, to collect personal data for financial gain. Players within the system have the power to assign value to the goods and sell them to the market. Deep learning, cloud computing, processors like GPUs, and the compute power required to train neural networks faster, technology that has become a cornerstone of major tech companies, fuel today's revival. So back to surveillance capitalism.

We need to understand what types of data are being collected and processed, and under which circumstances, before we can go into a discussion of whether it is legal or not. We as users are in control of the 'customer generated data' that we voluntarily share, but the extent of meta-data collection, such as 'user logs' and 'transaction records', and of the profile and prediction data generated about us and collected across digital borders through big data and analytics, is way beyond our imagination. Customer generated data may describe user properties and personal data directly, such as age, gender, and other content that users put into the digital service. Will this user buy from us in the future? Similarly, we know that Facebook has a vast number of third-party app providers supporting users with cool features in the social media platform.

Their purpose and economy are built on the customer base, and on their ability to attract and retain customers. Would the companies' innovation rate and eagerness to attract new and retain old customers be equally high without the existing competition to attract customers to the free service platforms?

Isaak and Hanna [12] are among the many that call for stronger regulations after the Cambridge Analytica scandal, where Facebook handed over information about 87 million users, and which influenced the 2016 US election. GDPR very well defines the requirements for consent, but I argue that the framework lacks one important definition: what is the required level of competency needed to make an informed decision to give a consent? Are we able to see into the future and understand how our data can be used and misused with current and future generations of machine learning and artificial intelligence?

Even for a company like Facebook this is a substantial amount of money; however, when the stock price increases on the news, is it actually an economic penalty? How do we measure goodwill? Toward an ethical data future: the impact of "Surveillance Capitalism" and this new data economy will be tremendous, but the outlook for the future is still up in the air.
The concept of surveillance capitalism, as described by Shoshana Zuboff, arose as advertising companies, led by Google's AdWords, saw the possibilities of using personal data to target consumers more precisely. The interview is a follow-up on the release of her book 'The Age of Surveillance Capitalism', released January 2019. The nub of Professor Zuboff's argument is that surveillance capitalism's target is human nature itself, with Zuboff calling out the 'data business' playbook of 'hidden extraction mechanisms', which she said is robbing us of the ability to fight back. "In surveillance capitalism, those rights are taken from us without our knowledge, understanding, or consent, and used to create products designed to predict our behavior." These products are then sold into new markets that she calls "behavioral futures markets." At each stage, "our lives are further exposed to others without our consent." Zuboff's ambitious project arguably lays the foundation for an entirely new field of research at the intersection of big data and digital ethics. Clicks, comments, transactions, and physical movements are being increasingly recorded and analyzed by Big Data processors who use this information to trace the sentiment and activities of markets and voters. The fourth industrial revolution is happening alongside historic income inequality and the new Gilded Age.

The Houston Astros appear to have used data unethically. Other teams and organizations have done so as well. Audience members will be more informed about the various moral meanings of private-sector surveillance and about their own connection to such practices. Coordination and collaboration are extremely simple and convenient. Users are inclined to enjoy the immediate gratification of good services, compared to the long term negative effects on their human right to privacy.

For each service provider we use, whether it is online shops or services, our health institution or at work, we generate digital identities with large amounts of personal data. From Article 9 in GDPR, we know that there are especially strict rules when it comes to special categories of personal data, including "racial or ethnic origin, political opinions, religious or philosophical beliefs, or trade union membership, and the processing of genetic data, biometric data for the purpose of uniquely identifying a natural person, data concerning health or data concerning a natural person's sex life or sexual orientation".

And what if the user has consented to the data treatment? We know that there are requirements and principles concerning purpose limitation, but what if users give their consent to other usage areas? From a legal and ethical perspective, can we really answer yes to these questions in the new digital era? In my discussion on legal aspects, I have also touched upon moral questions. These are complex topics. GDPR is to a large extent Europe's response to these complex challenges, but not without flaws, and the big companies will probably look for loopholes (ref. loophole ethics [5]) to continue their business as usual in the years to come.

GDPR does not regulate truly anonymized data sets. Erasing names and contact details alone is closer to the process of de-identification and pseudonymization, which, as we can see from Table 1, is regulated by GDPR. Trading anonymized data is legal, but is it morally right to trade personal data when you know the data set can be used for re-identification?
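A minimal sketch of the distinction, assuming a hypothetical data controller holding a secret key: direct identifiers are replaced by keyed pseudonyms, but the quasi-identifiers stay in the record, so the result is pseudonymized (still personal data under GDPR), not anonymized. The record, field names and key are invented.

```python
# Illustrative sketch: pseudonymisation versus anonymisation.
# Direct identifiers are replaced with a keyed hash, but quasi-identifiers
# (zip code, birth year, sex) remain, so the record may be re-identifiable.

import hashlib
import hmac

SECRET_KEY = b"hypothetical-key-held-by-the-data-controller"

def pseudonymise(record: dict) -> dict:
    """Replace direct identifiers with a keyed hash; keep everything else."""
    out = dict(record)
    for field in ("name", "email", "phone"):
        value = out.pop(field, None)
        if value is not None:
            out[f"{field}_pseudonym"] = hmac.new(
                SECRET_KEY, value.encode(), hashlib.sha256
            ).hexdigest()[:12]
    return out

patient = {
    "name": "Kari Nordmann", "email": "kari@example.com", "phone": "99999999",
    "zip": "0150", "birth_year": 1984, "sex": "F", "diagnosis": "diabetes",
}

print(pseudonymise(patient))
# The output still contains zip + birth_year + sex, the same quasi-identifiers
# used in the linkage sketch earlier, so this is de-identification, not
# anonymisation.
```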
Ethical dilemma: surveillance capitalism is facilitated by the technological revolution we have already mentioned: extreme data processing power, low cost storage and an abundance of data, which allows for big data analysis powered by machine learning and AI. The abundance of data and low-cost processing power and storage has influenced the field of Big Data Analytics. Further, Google has made major steps forward in AI research and has made advanced tools and algorithms open for private and professional users, free for use. Not to speak about the search engine provided by Google. During the last decade they have provided services that are world class and free. There is no doubt that these companies and their free services create value to society. We are analog people living in a digital world. With this in mind, it is hard to see that the massive collection, use and trade of personal data is in conflict with their values: it is their core.

Both Mitchell and Sherman explain why: the currency we as users pay with is our personal data, and as Zuboff states, this data is being sold in a new digital marketplace. Think about the research by Youyou and colleagues [6] that I mentioned earlier: by analyzing one single data source (Facebook likes), they were able to predict our psychological profile, political attitudes, physical health and more. Surveillance capitalism's "means of behavioral modification" at scale erodes democracy from within because, without autonomy in action and in thought, we have little capacity for the moral judgment and critical thinking necessary for a democratic society. On the one hand, I will comment on the ethical implications of public-private data sharing (such as the NSA retrieving data from Google) so common in the post-9/11 world.

Facebook settled on a 5 billion dollar fine [15]. Instagram, which is owned by Facebook, has seen an increase in users, and WhatsApp, also owned by Facebook, is equally popular as before. It is only in recent years we have heard about these kinds of breaches, and most incidents are still being kept secret by the affected companies.

I back up this claim with the facts that Shoshana Zuboff has illuminated us with her book on surveillance capitalism (2019), the EU's reaction by introducing GDPR (2018), the quite recent US congressional hearings after the Cambridge Analytica incidents [12] [13] (2018), and the fact that the US Congress is starting to investigate the power of the large social media platform companies [14] (2019). The fact that anonymized data can be re-identified, and the question of whether we as users have competence to give an informed consent or not, are topics for legal discussion. In these cases, the trading of data will be illegal according to GDPR, however often without the knowledge of the companies taking part in the trade. These topics are extremely complex, and very little understood. Based on this fairly long discussion about legal aspects, on the question posed by the ethical Navigation Wheel (is it legal?), we see that there is no black or white answer. If a user has given consent to data processing and data analysis, does he or she actually have the competence to give an informed consent?
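To make the consent and purpose-limitation discussion concrete, here is a minimal sketch of a hypothetical consent registry: consent is recorded per user and per explicit purpose, and processing for any purpose the user has not consented to is refused. It is an assumption-laden illustration, not a description of how any real platform implements GDPR Article 7; a production system would also need withdrawal, proof of consent, retention rules and more.

```python
# Hypothetical consent registry (illustrative sketch only).

from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class ConsentRecord:
    user_id: str
    purposes: set[str] = field(default_factory=set)
    given_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

class ConsentRegistry:
    def __init__(self) -> None:
        self._records: dict[str, ConsentRecord] = {}

    def give_consent(self, user_id: str, purposes: set[str]) -> None:
        # Record consent only for the purposes explicitly named by the user.
        record = self._records.setdefault(user_id, ConsentRecord(user_id))
        record.purposes |= purposes

    def may_process(self, user_id: str, purpose: str) -> bool:
        # Purpose limitation: processing is allowed only for consented purposes.
        record = self._records.get(user_id)
        return record is not None and purpose in record.purposes

registry = ConsentRegistry()
registry.give_consent("user-42", {"order fulfilment", "service analytics"})

print(registry.may_process("user-42", "service analytics"))     # True
print(registry.may_process("user-42", "targeted advertising"))  # False: no consent given
```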
Within the EU, processing of personal data is regulated by the General Data Protection Regulation (GDPR). For most digital business-to-consumer (B2C) companies, such as Facebook, Google, Twitter, Amazon and the like, the data subject's/user's consent is the legal foundation for data processing (GDPR Article 7). If a company does the right things right, it may be possible to operate within the boundaries of the legal GDPR framework, even in a surveillance capitalism … Let us assume all trade of data within the surveillance capitalism regime is in anonymized data sets.

Think about our everyday digitally connected life, where we use smart phones, smart watches, apps and PCs, visit web pages, participate in social networks, and so on. There is strong reason to spend time getting a basic understanding of the data being collected in the digital space, and a basic understanding that this data is being used to build user profiles that predict user behavior with a high degree of validity. As a researcher I started to get worried ten years ago and wrote a blog post on the topic back in 2012 [5].

One would expect the stock market to react negatively; however, what happened is that the stock price increased with the news. When users actively and voluntarily use the services, is that not a sign that the companies' actions and business processes can be justified? Only the future will show whether we accept and neutralize the situation as a society, or not.

References

Barth, S. and M.D.T. de Jong, The privacy paradox – Investigating discrepancies between expressed privacy concerns and actual online behavior – A systematic literature review. Telematics and Informatics, 2017. 34(7): p. 1038-1058.
Chen, H., R.H.L. Chiang, and V.C. Storey, Business Intelligence and Analytics: From Big Data to Big Impact. MIS Quarterly, 2012.
Gesenhues, A., Facebook lost 15 million users? Marketers remain unfazed.
Grimes, D.R.
Hansen, M., A. Pfitzmann, and S. Steinbrecher, Identity management throughout one's whole life.
Høgseth, M.H.
Isaak, J. and M.J. Hanna, User Data Privacy: Facebook, Cambridge Analytica, and Privacy Protection. Computer, 2018. 51(8): p. 56-59.
Landau, S. and T. Moore, Economic tussles in federated identity management. First Monday, 2012.
Mitchell, R., What Google Knows About You. Computerworld, 2009.
Rocher, L., J.M. Hendrickx, and Y.-A. de Montjoye, Estimating the success of re-identifications in incomplete datasets using generative models. Nature Communications, 2019.
Romm, T. and D. Harwell. The Washington Post, 2019 [cited 2019 22 September]; Available from: https://www.washingtonpost.com/technology/2019/09/18/facebook-google-twitter-face-fresh-heat-congress-harmful-online-content/.
Sherman, L., Why Facebook Will Never Change Its Business Model. Forbes, 2018 [cited 2018 21 September]; Available from: https://www.forbes.com/sites/lensherman/2018/04/16/why-facebook-will-never-change-its-business-model/#5946b93564a7.
Smith, D., Zuckerberg put on back foot as House grills Facebook CEO over user tracking.
Youyou, W., M. Kosinski, and D. Stillwell, Computer-based personality judgments are more accurate than those made by humans. Proceedings of the National Academy of Sciences, 2015.