Human beings value their privacy and the protection of their personal sphere of life. They value some control over who knows what about them. They certainly do not want their personal information to be accessible to just anyone at any time. But recent advances in information technology threaten privacy and have reduced the amount of control over personal data and open up the possibility of a range of negative consequences as a result of access to personal data. In the second half of the 20th century data protection regimes have been put in place as a response to increasing levels of processing of personal data. The 21st century has become the century of big data and advanced information technology (e.g. forms of deep learning), the rise of big tech companies and the platform economy, which comes with the storage and processing of exabytes of data.
The revelations of Edward Snowden, and more recently the Cambridge Analytica case (Cadwalladr & Graham-Harrison 2018) have demonstrated that worries about negative consequences are real. The technical capabilities to collect, store and search large quantities of data concerning telephone conversations, internet searches and electronic payment are now in place and are routinely used by government agencies and corporate actors alike. The rise of China and the large scale of use and spread of advanced digital technologies for surveillance and control have only added to the concern of many. For business firms, personal data about customers and potential customers are now also a key asset. The scope and purpose of the personal data centred business models of Big Tech (Google, Amazon, Facebook, Microsoft, Apple) has been described in detail by Shoshana Zuboff (2018) under the label 'surveillance capitalism'.
At the same time, the meaning and value of privacy remains the subject of considerable controversy. The combination of increasing power of new technology and the declining clarity and agreement on privacy give rise to problems concerning law, policy and ethics. Many of these conceptual debates and issues are situated in the context of interpretation and analysis of the General Data Protection Regulation (GDPR) that was adopted by the EU in spring 2018 as the successor of the EU 1995 Directives, with application far beyond the borders of the European Union.
The focus of this article is on exploring the relationship between information technology and privacy. We will both illustrate the specific threats that IT and innovations in IT pose for privacy and indicate how IT itself might be able to overcome these privacy concerns by being developed in ways that can be termed 'privacy-sensitive', 'privacy enhancing' or 'privacy respecting'. We will also discuss the role of emerging technologies in the debate, and account for the way in which moral debates are themselves affected by IT.
- 1. Conceptions of privacy and the value of privacy
- 2. The impact of information technology on privacy
- 3. How can information technology itself solve privacy concerns?
- 4. Emerging technologies and our understanding of privacy
- Bibliography
1. Conceptions of privacy and the value of privacy
Discussions about privacy are intertwined with the use of technology. The publication that began the debate about privacy in the Western world was occasioned by the introduction of the newspaper printing press and photography. Samuel D. Warren and Louis Brandeis wrote their article on privacy in the Harvard Law Review (Warren & Brandeis 1890) partly in protest against the intrusive activities of the journalists of those days. They argued that there is a 'right to be left alone' based on a principle of 'inviolate personality'. Since the publication of that article, the debate about privacy has been fuelled by claims regarding the right of individuals to determine the extent to which others have access to them (Westin 1967) and claims regarding the right of society to know about individuals. Information being a cornerstone of access to individuals, the privacy debate has co-evolved with – and in response to – the development of information technology. It is therefore difficult to conceive of the notions of privacy and discussions about data protection as separate from the way computers, the Internet, mobile computing and the many applications of these basic technologies have evolved.
1.1 Constitutional vs. informational privacy
Inspired by subsequent developments in U.S. law, a distinction can be made between (1) constitutional (or decisional) privacy and (2) tort (or informational) privacy (DeCew 1997). The first refers to the freedom to make one's own decisions without interference by others in regard to matters seen as intimate and personal, such as the decision to use contraceptives or to have an abortion. The second is concerned with the interest of individuals in exercising control over access to information about themselves and is most often referred to as 'informational privacy'. Think here, for instance, about information disclosed on Facebook or other social media. All too easily, such information might be beyond the control of the individual.
Statements about privacy can be either descriptive or normative, depending on whether they are used to describe the way people define situations and conditions of privacy and the way they value them, or are used to indicate that there ought to be constraints on the use of information or information processing. These conditions or constraints typically involve personal information regarding individuals, or ways of information processing that may affect individuals. Informational privacy in a normative sense refers typically to a non-absolute moral right of persons to have direct or indirect control over access to (1) information about oneself, (2) situations in which others could acquire information about oneself, and (3) technology that can be used to generate, process or disseminate information about oneself.
1.2 Accounts of the value of privacy
The debates about privacy are almost always revolving around new technology, ranging from genetics and the extensive study of bio-markers, brain imaging, drones, wearable sensors and sensor networks, social media, smart phones, closed circuit television, to government cybersecurity programs, direct marketing, RFID tags, Big Data, head-mounted displays and search engines. There are basically two reactions to the flood of new technology and its impact on personal information and privacy: the first reaction, held by many people in the IT industry and in R&D, is that we have zero privacy in the digital age and that there is no way we can protect it, so we should get used to the new world and get over it (Sprenger 1999). The other reaction is that our privacy is more important than ever and that we can and we must attempt to protect it.
In the literature on privacy, there are many competing accounts of the nature and value of privacy (Negley 1966, Rössler 2005). On one end of the spectrum, reductionist accounts argue that privacy claims are really about other values and other things that matter from a moral point of view. According to these views the value of privacy is reducible to these other values or sources of value (Thomson 1975). Proposals that have been defended along these lines mention property rights, security, autonomy, intimacy or friendship, democracy, liberty, dignity, or utility and economic value. Reductionist accounts hold that the importance of privacy should be explained and its meaning clarified in terms of those other values and sources of value (Westin 1967). The opposing view holds that privacy is valuable in itself and its value and importance are not derived from other considerations (see for a discussion Rössler 2004). Views that construe privacy and the personal sphere of life as a human right would be an example of this non-reductionist conception.
More recently a type of privacy account has been proposed in relation to new information technology, which acknowledges that there is a cluster of related moral claims underlying appeals to privacy, but maintains that there is no single essential core of privacy concerns. This approach is referred to as cluster accounts (DeCew 1997; Solove 2006; van den Hoven 1999; Allen 2011; Nissenbaum 2004).
From a descriptive perspective, a recent further addition to the body of privacy accounts are epistemic accounts, where the notion of privacy is analyzed primarily in terms of knowledge or other epistemic states. Having privacy means that others don't know certain private propositions; lacking privacy means that others do know certain private propositions (Blaauw 2013). An important aspect of this conception of having privacy is that it is seen as a relation (Rubel 2011; Matheson 2007; Blaauw 2013) with three argument places: a subject (S), a set of propositions (P) and a set of individuals (I). Here S is the subject who has (a certain degree of) privacy. P is composed of those propositions the subject wants to keep private (call the propositions in this set 'personal propositions'), and I is composed of those individuals with respect to whom S wants to keep the personal propositions private.
Another distinction that is useful to make is the one between a European and a US American approach. A bibliometric study suggests that the two approaches are separate in the literature. The first conceptualizes issues of informational privacy in terms of 'data protection', the second in terms of 'privacy' (Heersmink et al. 2011). In discussing the relationship of privacy matters with technology, the notion of data protection is most helpful, since it leads to a relatively clear picture of what the object of protection is and by which technical means the data can be protected. At the same time it invites answers to the question why the data ought to be protected, pointing to a number of distinctive moral grounds on the basis of which technical, legal and institutional protection of personal data can be justified. Informational privacy is thus recast in terms of the protection of personal data (van den Hoven 2008). This account shows how Privacy, Technology and Data Protection are related, without conflating Privacy and Data Protection.
1.3 Personal Data
Personal information or data is information or data that is linked or can be linked to individual persons. Examples include explicitly stated characteristics such as a person's date of birth, sexual preference, whereabouts, religion, but also the IP address of your computer or metadata pertaining to these kinds of information. In addition, personal data can also be more implicit in the form of behavioural data, for example from social media, that can be linked to individuals. Personal data can be contrasted with data that is considered sensitive, valuable or important for other reasons, such as secret recipes, financial data, or military intelligence. Data used to secure other information, such as passwords, are not considered here. Although such security measures (passwords) may contribute to privacy, their protection is only instrumental to the protection of other (more private) information, and the quality of such security measures is therefore out of the scope of our considerations here.
A relevant distinction that has been made in philosophical semantics is that between the referential and the attributive use of descriptive labels of persons (van den Hoven 2008). Personal data is defined in the law as data that can be linked with a natural person. There are two ways in which this link can be made; a referential mode and a non-referential mode. The law is primarily concerned with the 'referential use' of descriptions or attributes, the type of use that is made on the basis of a (possible) acquaintance relationship of the speaker with the object of his knowledge. 'The murderer of Kennedy must be insane', uttered while pointing to him in court, is an example of a referentially used description. This can be contrasted with descriptions that are used attributively as in 'the murderer of Kennedy must be insane, whoever he is'. In this case, the user of the description is not – and may never be – acquainted with the person he is talking about or intends to refer to. If the legal definition of personal data is interpreted referentially, much of the data that could at some point in time be brought to bear on persons would be unprotected; that is, the processing of this data would not be constrained on moral grounds related to privacy or personal sphere of life, since it does not 'refer' to persons in a straightforward way and therefore does not constitute 'personal data' in a strict sense.
1.4 Moral reasons for protecting personal data
The following types of moral reasons for the protection of personal data and for providing direct or indirect control over access to those data by others can be distinguished (van den Hoven 2008):
- Prevention of harm: Unrestricted access by others to one's bank account, profile, social media account, cloud repositories, characteristics, and whereabouts can be used to harm the data subject in a variety of ways.
- Informational inequality: Personal data have become commodities. Individuals are usually not in a good position to negotiate contracts about the use of their data and do not have the means to check whether partners live up to the terms of the contract. Data protection laws, regulation and governance aim at establishing fair conditions for drafting contracts about personal data transmission and exchange and providing data subjects with checks and balances, guarantees for redress and means to monitor compliance with the terms of the contract. Flexible pricing, price targeting and price gauging, dynamic negotiations are typically undertaken on the basis of asymmetrical information and great disparities in access to information. Also choice modelling in marketing, micro-targeting in political campaigns, and nudging in policy implementation exploit a basic informational inequality of principal and agent.
- Informational injustice and discrimination: Personal information provided in one sphere or context (for example, health care) may change its meaning when used in another sphere or context (such as commercial transactions) and may lead to discrimination and disadvantages for the individual. This is related to the discussion on contextual integrity by Nissenbaum (2004) and Walzerian spheres of justice (Van den Hoven 2008).
- Encroachment on moral autonomy and human dignity: Lack of privacy may expose individuals to outside forces that influence their choices and bring them to make decisions they would not have otherwise made. Mass surveillance leads to a situation where routinely, systematically, and continuously individuals make choices and decisions because they know others are watching them. This affects their status as autonomous beings and has what sometimes is described as a 'chilling effect' on them and on society. Closely related are considerations of violations of respect for persons and human dignity. The massive accumulation of data relevant to a person's identity (e.g. brain-computer interfaces, identity graphs, digital doubles or digital twins, analysis of the topology of one's social networks) may give rise to the idea that we know a particular person since there is so much information about her. It can be argued that being able to figure people out on the basis of their big data constitutes an epistemic and moral immodesty (Bruynseels & Van den Hoven 2015), which fails to respect the fact that human beings are subjects with private mental states that have a certain quality that is inaccessible from an external perspective (third or second person perspective) – however detailed and accurate that may be. Respecting privacy would then imply a recognition of this moral phenomenology of human persons, i.e. recognising that a human being is always more than advanced digital technologies can deliver.
These considerations all provide good moral reasons for limiting and constraining access to personal data and providing individuals with control over their data.
1.5 Law, regulation, and indirect control over access
Acknowledging that there are moral reasons for protecting personal data, data protection laws are in force in almost all countries. The basic moral principle underlying these laws is the requirement of informed consent for processing by the data subject, providing the subject (at least in principle) with control over potential negative effects as discussed above. Furthermore, processing of personal information requires that its purpose be specified, its use be limited, individuals be notified and allowed to correct inaccuracies, and the holder of the data be accountable to oversight authorities (OECD 1980). Because it is impossible to guarantee compliance of all types of data processing in all these areas and applications with these rules and laws in traditional ways, so-called 'privacy-enhancing technologies' (PETs) and identity management systems are expected to replace human oversight in many cases. The challenge with respect to privacy in the twenty-first century is to assure that technology is designed in such a way that it incorporates privacy requirements in the software, architecture, infrastructure, and work processes in a way that makes privacy violations unlikely to occur. New generations of privacy regulations (e.g. GDPR) now standardly require a 'privacy by design' approach. The data ecosystems and socio-technical systems, supply chains, organisations, including incentive structures, business processes, and technical hardware and software, training of personnel, should all be designed in such a way that the likelihood of privacy violations is as low as possible.
2. The impact of information technology on privacy
The debates about privacy are almost always revolving around new technology, ranging from genetics and the extensive study of bio-markers, brain imaging, drones, wearable sensors and sensor networks, social media, smart phones, closed circuit television, to government cybersecurity programs, direct marketing, surveillance, RFID tags, big data, head-mounted displays and search engines. The impact of some of these new technologies, with a particular focus on information technology, is discussed in this section.
2.1 Developments in information technology
'Information technology' refers to automated systems for storing, processing, and distributing information. Typically, this involves the use of computers and communication networks. The amount of information that can be stored or processed in an information system depends on the technology used. The capacity of the technology has increased rapidly over the past decades, in accordance with Moore's law. This holds for storage capacity, processing capacity, and communication bandwidth. We are now capable of storing and processing data on the exabyte level. For illustration, to store 100 exabytes of data on 720 MB CD-ROM discs would require a stack of them that would almost reach the moon.
These developments have fundamentally changed our practices of information provisioning. The rapid changes have increased the need for careful consideration of the desirability of effects. Some even speak of a digital revolution as a technological leap similar to the industrial revolution, or a digital revolution as a revolution in understanding human nature and the world, similar to the revolutions of Copernicus, Darwin and Freud (Floridi 2008). In both the technical and the epistemic sense, emphasis has been put on connectivity and interaction. Physical space has become less important, information is ubiquitous, and social relations have adapted as well.
As we have described privacy in terms of moral reasons for imposing constraints on access to and/or use of personal information, the increased connectivity imposed by information technology poses many questions. In a descriptive sense, access has increased, which, in a normative sense, requires consideration of the desirability of this development, and evaluation of the potential for regulation by technology (Lessig 1999), institutions, and/or law.
As connectivity increases access to information, it also increases the possibility for agents to act based on the new sources of information. When these sources contain personal information, risks of harm, inequality, discrimination, and loss of autonomy easily emerge. For example, your enemies may have less difficulty finding out where you are, users may be tempted to give up privacy for perceived benefits in online environments, and employers may use online information to avoid hiring certain groups of people. Furthermore, systems rather than users may decide which information is displayed, thus confronting users only with news that matches their profiles.
Although the technology operates on a device level, information technology consists of a complex system of socio-technical practices, and its context of use forms the basis for discussing its role in changing possibilities for accessing information, and thereby impacting privacy. We will discuss some specific developments and their impact in the following sections.
2.2 Internet
The Internet, originally conceived in the 1960s and developed in the 1980s as a scientific network for exchanging information, was not designed for the purpose of separating information flows (Michener 1999). The World Wide Web of today was not foreseen, and neither was the possibility of misuse of the Internet. Social network sites emerged for use within a community of people who knew each other in real life – at first, mostly in academic settings – rather than being developed for a worldwide community of users (Ellison 2007). It was assumed that sharing with close friends would not cause any harm, and privacy and security only appeared on the agenda when the network grew larger. This means that privacy concerns often had to be dealt with as add-ons rather than by-design.
A major theme in the discussion of Internet privacy revolves around the use of cookies (Palmer 2005). Cookies are small pieces of data that web sites store on the user's computer, in order to enable personalization of the site. However, some cookies can be used to track the user across multiple web sites (tracking cookies), enabling for example advertisements for a product the user has recently viewed on a totally different site. Again, it is not always clear what the generated information is used for. Laws requiring user consent for the use of cookies are not always successful in terms of increasing the level of control, as the consent requests interfere with task flows, and the user may simply click away any requests for consent (Leenes & Kosta 2015). Similarly, features of social network sites embedded in other sites (e.g. 'like'-button) may allow the social network site to identify the sites visited by the user (Krishnamurthy & Wills 2009).
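To make the mechanism concrete, the following minimal Python sketch (standard library only) shows how a tracking cookie works at the HTTP level: a site, or a third party whose content is embedded on many sites, issues a random identifier once and recognises it on every later request. The cookie name 'uid' and the domain tracker.example are hypothetical, not taken from any particular service.

```python
import uuid
from http.cookies import SimpleCookie

def make_set_cookie_header() -> str:
    """First visit: issue a fresh identifier via a Set-Cookie response header."""
    cookie = SimpleCookie()
    cookie["uid"] = uuid.uuid4().hex                  # random, stable identifier
    cookie["uid"]["max-age"] = 60 * 60 * 24 * 365     # persist for a year
    cookie["uid"]["domain"] = "tracker.example"       # hypothetical third-party domain
    return cookie.output(header="Set-Cookie:")

def read_uid(cookie_header: str):
    """Later visits: the browser sends the Cookie header back automatically."""
    cookie = SimpleCookie()
    cookie.load(cookie_header)
    return cookie["uid"].value if "uid" in cookie else None

print(make_set_cookie_header())
# e.g. Set-Cookie: uid=3f2c9a...; Domain=tracker.example; Max-Age=31536000
print(read_uid("uid=3f2c9a; theme=dark"))   # the tracker links this visit to 3f2c9a
```

Because the same identifier is returned on every page that embeds the tracker's content, visits to otherwise unrelated sites can be linked into a single browsing profile.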
The recent development of cloud computing increases the many privacy concerns (Ruiter & Warnier 2011). Previously, whereas information would be available from the web, user data and programs would still be stored locally, preventing program vendors from having access to the data and usage statistics. In cloud computing, both data and programs are online (in the cloud), and it is not always clear what the user-generated and system-generated data are used for. Moreover, as data are located elsewhere in the world, it is not even always obvious which law is applicable, and which authorities can demand access to the data. Data gathered by online services and apps such as search engines and games are of particular concern here. Which data are used and communicated by applications (browsing history, contact lists, etc.) is not always clear, and even when it is, the only choice available to the user may be not to use the application.
Some special features of Internet privacy (social media and big data) are discussed in the following sections.
2.3 Social media
Social media pose additional challenges. The question is not merely about the moral reasons for limiting access to information, it is also about the moral reasons for limiting the invitations to users to submit all kinds of personal information. Social network sites invite the user to generate more data, to increase the value of the site ('your profile is …% complete'). Users are tempted to exchange their personal data for the benefits of using services, and provide both this data and their attention as payment for the services. In addition, users may not even be aware of what information they are tempted to provide, as in the aforementioned case of the 'like'-button on other sites. Merely limiting the access to personal information does not do justice to the issues here, and the more fundamental question lies in steering the users' behaviour of sharing. When the service is free, the data is needed as a form of payment.
One way of limiting the temptation of users to share is requiring default privacy settings to be strict. Even then, this limits access for other users ('friends of friends'), but it does not limit access for the service provider. Also, such restrictions limit the value and usability of the social network sites themselves, and may reduce positive effects of such services. A particular example of privacy-friendly defaults is the opt-in as opposed to the opt-out approach. When the user has to take an explicit action to share data or to subscribe to a service or mailing list, the resulting effects may be more acceptable to the user. However, much still depends on how the choice is framed (Bellman, Johnson, & Lohse 2001).
2.4 Big data
Users generate loads of data when online. This is not only data explicitly entered by the user, but also numerous statistics on user behavior: sites visited, links clicked, search terms entered, etc. Data mining can be employed to extract patterns from such data, which can then be used to make decisions about the user. These may only affect the online experience (advertisements shown), but, depending on which parties have access to the information, they may also impact the user in completely different contexts.
In particular, big data may be used in profiling the user (Hildebrandt 2008), creating patterns of typical combinations of user properties, which can then be used to predict interests and behavior. An innocent application is 'you may also like …', but, depending on the available data, more sensitive derivations may be made, such as most probable religion or sexual preference. These derivations could then in turn lead to unequal treatment or discrimination. When a user can be assigned to a particular group, even only probabilistically, this may influence the actions taken by others (Taylor, Floridi, & Van der Sloot 2017). For example, profiling could lead to refusal of insurance or a credit card, in which case profit is the main reason for discrimination. When such decisions are based on profiling, it may be difficult to challenge them or even find out the explanations behind them. Profiling could also be used by organizations or possible future governments that have discrimination of particular groups on their political agenda, in order to find their targets and deny them access to services, or worse.
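As a rough illustration of how such profiling works technically, the sketch below trains a classifier on synthetic behavioural data to estimate the probability that a user belongs to a particular group. The features, the label and the use of scikit-learn are illustrative assumptions, not a description of any actual system.

```python
# Profiling sketch: inferring a (sensitive) group membership from behavioural data.
# The data here is synthetic; any real deployment of such a model raises exactly
# the discrimination concerns discussed above.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
# Each row: [visits to site A, visits to site B, purchases in category C]
behaviour = rng.poisson(lam=[3, 5, 1], size=(200, 3))
# Hypothetical sensitive label, correlated with the first behavioural feature.
label = (behaviour[:, 0] + rng.normal(size=200) > 3).astype(int)

model = LogisticRegression().fit(behaviour, label)
new_user = [[6, 2, 0]]
print(model.predict_proba(new_user))  # estimated probability of group membership
```

Even this toy model shows the asymmetry at stake: the data subject never sees the model, yet its probabilistic output can be used to treat her differently.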
Big data does not only emerge from Internet transactions. Similarly, data may be collected when shopping, when being recorded by surveillance cameras in public or private spaces, or when using smartcard-based public transport payment systems. All these data could be used to profile citizens, and base decisions upon such profiles. For example, shopping data could be used to send information about healthy food habits to particular individuals, but again also for decisions on insurance. According to EU data protection law, permission is needed for processing personal data, and they can only be processed for the purpose for which they were obtained. Specific challenges, therefore, are (a) how to obtain permission when the user does not explicitly engage in a transaction (as in case of surveillance), and (b) how to prevent 'function creep', i.e. data being used for different purposes after they are collected (as may happen for example with DNA databases (Dahl & Sætnan 2009)).
One particular concern could emerge from genetics and genomic data (Tavani 2004, Bruynseels & van den Hoven 2015). Like other data, genomic data can be used to make predictions, and in particular could predict risks of diseases. Apart from others having access to detailed user profiles, a fundamental question here is whether the individual should know what is known about her. In general, users could be said to have a right to access any information stored about them, but in this case, there may also be a right not to know, in particular when knowledge of the data (e.g. risks of diseases) would reduce well-being – by causing fear, for instance – without enabling treatment. With respect to previous examples, one may not want to know the patterns in one's own shopping behavior either.
2.5 Mobile devices
As users increasingly own networked devices such as smart phones, mobile devices collect and send more and more data. These devices typically contain a range of data-generating sensors, including GPS (location), movement sensors, and cameras, and may transmit the resulting data via the Internet or other networks. One particular example concerns location data. Many mobile devices have a GPS sensor that registers the user's location, but even without a GPS sensor, approximate locations can be derived, for example by monitoring the available wireless networks. As location data links the online world to the user's physical environment, with the potential of physical harm (stalking, burglary during holidays, etc.), such data are often considered particularly sensitive.
Many of these devices also contain cameras which, when applications have access, can be used to take pictures. These can be considered sensors as well, and the data they generate may be particularly private. For sensors like cameras, it is assumed that the user is aware when they are activated, and privacy depends on such knowledge. For webcams, a light typically indicates whether the camera is on, but this light may be manipulated by malicious software. In general, 'reconfigurable technology' (Dechesne, Warnier, & van den Hoven 2011) that handles personal data raises the question of user knowledge of the configuration.
2.6 The Internet of Things
Devices connected to the Internet are not limited to user-owned computing devices like smartphones. Many devices contain chips and/or are connected in the so-called Internet of Things. RFID (radio frequency identification) chips can be read from a limited distance, such that you can hold them in front of a reader rather than inserting them. EU and US passports have RFID chips with protected biometric data, but information like the user's nationality may easily leak when attempting to read such devices (see Richter, Mostowski & Poll 2008, in Other Internet Resources). 'Smart' RFIDs are also embedded in public transport payment systems. 'Dumb' RFIDs, basically only containing a number, appear in many kinds of products as a replacement of the barcode, and for use in logistics. Still, such chips could be used to trace a person once it is known that he carries an item containing a chip.
In the home, there are smart meters for automatically reading and sending electricity and water consumption, and thermostats and other devices that can be remotely controlled by the owner. Such devices again generate statistics, and these can be used for mining and profiling. In the future, more and more household appliances will be connected, each generating its own information. Ambient intelligence (Brey 2005), and ubiquitous computing, along with the Internet of Things (Friedewald & Raabe 2011), also enable automatic adaptation of the environment to the user, based on explicit preferences and implicit observations, and user autonomy is a central theme in considering the privacy implications of such devices. In general, the move towards a service-oriented provisioning of goods, with suppliers being informed about how the products are used through IT and associated connectivity, requires consideration of the associated privacy and transparency concerns (Pieters 2013). For example, users will need to be informed when connected devices contain a microphone and how and when it is used.
2.7 E-Government
Government and public administration have undergone radical transformations as a result of the availability of advanced IT systems as well. Examples of these changes are biometric passports, online e-government services, voting systems, a variety of online citizen participation tools and platforms, or online access to recordings of sessions of parliament and government committee meetings.
Consider the case of voting in elections. Information technology may play a role in different phases in the voting process, which may have different impact on voter privacy. Most countries have a requirement that elections are to be held by secret ballot, to prevent vote buying and coercion. In this case, the voter is supposed to keep her vote private, even if she would want to reveal it. For information technology used for casting votes, this is defined as the requirement of receipt-freeness or coercion-resistance (Delaune, Kremer & Ryan 2006). In polling stations, the authorities see to it that the voter keeps the vote private, but such surveillance is not possible when voting by mail or online, and it cannot even be enforced by technological means, as someone can always watch while the voter votes. In this case, privacy is not only a right but also a duty, and information technology developments play an important role in the possibilities of the voter to fulfill this duty, as well as the possibilities of the authorities to verify this. In a broader sense, e-democracy initiatives may change the way privacy is viewed in the political process.
More generally, privacy is important in democracy to prevent undue influence. While lack of privacy in the voting process could enable vote buying and coercion, there are more subtle ways of influencing the democratic process, for example through targeted (mis)information campaigns. Online (political) activities of citizens on for example social media facilitate such attempts because of the possibility of targeting through behavioural profiling. Compared to offline political activities, it is more difficult to hide preferences and activities, breaches of confidentiality are more likely, and attempts to influence opinions become more scalable.
2.8 Surveillance
Information technology is used for all kinds of surveillance tasks. It can be used to augment and extend traditional surveillance systems such as CCTV and other camera systems, for example to identify specific individuals in crowds, using face recognition techniques, or to monitor specific places for unwanted behaviour. Such approaches become even more powerful when combined with other techniques, such as monitoring of Internet-of-Things devices (Motlagh et al. 2017).
Besides augmenting existing surveillance systems, ICT techniques are nowadays mainly used in the digital domain, typically grouped together under the term 'surveillance capitalism' (Zuboff 2019). Social media and other online systems are used to gather large amounts of data about individuals – either 'voluntarily', because users subscribe to a specific service (Google, Facebook), or involuntarily, by gathering all kinds of user-related data in a less transparent manner. Data analysis and machine learning techniques are then used to generate prediction models of individual users that can be used, for example, for targeted advertisement, but also for more malicious intents such as fraud or micro-targeting to influence elections (Albright 2016, Other Internet Resources) or referenda such as Brexit (Cadwalladr 2019, Other Internet Resources).
In addition to the private sector surveillance industry, governments form another traditional group that uses surveillance techniques at a large scale, either by intelligence services or law enforcement. These types of surveillance systems are typically justified with an appeal to the 'greater good' and protecting citizens, but their use is also controversial. For such systems, one would typically like to ensure that any negative effects on privacy are proportional to the benefits achieved by the technology. Especially since these systems are typically shrouded in secrecy, it is difficult for outsiders to see if such systems are used proportionally, or indeed useful for their tasks (Lawner 2002). This is particularly pressing when governments use private sector data or services for surveillance purposes.
The almost universal use of good encryption techniques in communication systems makes it also harder to gather effective surveillance information, leading to more and more calls for 'back doors' that can exclusively be used by government in communication systems. From a privacy standpoint this could be evaluated as unwanted, not only because it gives governments access to private conversations, but also because it lowers the overall security of communication systems that employ this technique (Abelson et al. 2015).
3. How can information technology itself solve privacy concerns?
Whereas information technology is typically seen as the cause of privacy problems, there are also several ways in which information technology can help to solve these problems. There are rules, guidelines or best practices that can be used for designing privacy-preserving systems. Such possibilities range from ethically-informed design methodologies to using encryption to protect personal information from unauthorized use. In particular, methods from the field of information security, aimed at protecting information against unauthorized access, can play a key role in the protection of personal data.
3.1 Design methods
Value sensitive design provides a 'theoretically grounded approach to the design of technology that accounts for human values in a principled and comprehensive manner throughout the design process' (Friedman et al. 2006). It provides a set of rules and guidelines for designing a system with a certain value in mind. One such value can be 'privacy', and value sensitive design can thus be used as a method to design privacy-friendly IT systems (Van den Hoven et al. 2015). The 'privacy by design' approach as advocated by Cavoukian (2009) and others can be regarded as one of the value sensitive design approaches that specifically focuses on privacy (Warnier et al. 2015). More recently, approaches such as 'privacy engineering' (Ceross & Simpson 2018) extend the privacy by design approach by aiming to provide a more practical, deployable set of methods by which to achieve system-wide privacy.
The privacy by design approach provides high-level guidelines in the form of principles for designing privacy-preserving systems. These principles have at their core that 'data protection needs to be viewed in proactive rather than reactive terms, making privacy by design preventive and not simply remedial' (Cavoukian 2010). Privacy by design's main point is that data protection should be central in all phases of product life cycles, from initial design to operational use and disposal (see Colesky et al. 2016 for a critical analysis of the privacy by design approach). The Privacy Impact Assessment approach proposed by Clarke (2009) makes a similar point. It proposes 'a systematic process for evaluating the potential effects on privacy of a project, initiative or proposed system or scheme' (Clarke 2009). Note that these approaches should not only be seen as auditing approaches, but rather as a means to make privacy awareness and compliance an integral part of the organizational and engineering culture.
There are also several industry guidelines that can be used to design privacy preserving IT systems. The Payment Card Industry Data Security Standard (see PCI DSS v3.2, 2018, in the Other Internet Resources), for example, gives very clear guidelines for privacy and security sensitive systems design in the domain of the credit card industry and its partners (retailers, banks). Various International Organization for Standardization (ISO) standards (Hone & Eloff 2002) also serve as a source of best practices and guidelines, especially with respect to information security, for the design of privacy friendly systems. Furthermore, the principles that are formed by the EU Data Protection Directive, which are themselves based on the Fair Information Practices (Gellman 2014) from the early 70s – transparency, purpose, proportionality, access, transfer – are technologically neutral and as such can also be considered as high level 'design principles'. Systems that are designed with these rules and guidelines in mind should thus – in principle – be in compliance with EU privacy laws and respect the privacy of their users.
The rules and principles described above give high-level guidance for designing privacy-preserving systems, but this does not mean that if these methodologies are followed the resulting IT system will (automatically) be privacy friendly. Some design principles are rather vague and abstract. What does it mean to make a transparent design or to design for proportionality? The principles need to be interpreted and placed in a context when designing a specific system. But different people will interpret the principles differently, which will lead to different design choices, with different effects on privacy. There is also a difference between the design and the implementation of a computer system. During the implementation phase software bugs are introduced, some of which can be exploited to break the system and extract private information. How to implement bug-free computer systems remains an open research question (Hoare 2003). In addition, implementation is another phase wherein choices and interpretations are made: system designs can be implemented in infinitely many ways. Moreover, it is very hard to verify – for anything beyond non-trivial systems – whether an implementation meets its design/specification (Loeckx, Sieber, & Stansifer 1985). This is even more difficult for non-functional requirements such as 'being privacy preserving' or security properties in general.
Some specific solutions to privacy problems aim at increasing the level of awareness and consent of the user. These solutions can be seen as an attempt to apply the notion of informed consent to privacy issues with technology (Custers et al. 2018). This is connected to the idea that privacy settings and policies should be explainable to users (Pieters 2011). For example, the Privacy Coach supports customers in making privacy decisions when confronted with RFID tags (Broenink et al. 2010). However, users have only a limited capability of dealing with such choices, and providing too many choices may easily lead to the problem of moral overload (van den Hoven, Lokhorst, & Van de Poel 2012). A technical solution is support for automatic matching of a privacy policy set by the user against policies issued by web sites or apps.
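A minimal sketch of what such automatic policy matching could look like is given below: the user states once which data categories may be used for which purposes, and a site's declared policy is checked against those preferences before any data is shared. The category names, purposes and the simple all-or-nothing rule are hypothetical simplifications.

```python
# User preferences: which (data category, purpose) pairs are allowed.
USER_PREFERENCES = {
    ("email", "service"): True,
    ("email", "marketing"): False,
    ("location", "service"): False,
}

def policy_acceptable(site_policy: list) -> bool:
    """Accept only if every (data, purpose) pair the site requests is allowed."""
    return all(USER_PREFERENCES.get(request, False) for request in site_policy)

site_policy = [("email", "service"), ("email", "marketing")]
print(policy_acceptable(site_policy))   # False: marketing use of email is not allowed
```

Real proposals along these lines additionally need a shared vocabulary for categories and purposes, and a way to hold sites to the policies they declare.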
3.2 Privacy enhancing technologies
A growing number of software tools are available that provide some form of privacy (usually anonymity) for their users; such tools are commonly known as privacy enhancing technologies (Danezis & Gürses 2010, Other Internet Resources). Examples include communication-anonymizing tools such as Tor (Dingledine, Mathewson, & Syverson 2004) and Freenet (Clarke et al. 2001), and identity-management systems for which many commercial software packages exist (see below). Communication anonymizing tools allow users to anonymously browse the web (with Tor) or anonymously share content (Freenet). They employ a number of cryptographic techniques and security protocols in order to ensure their goal of anonymous communication. Both systems use the property that numerous users use the system at the same time, which provides k-anonymity (Sweeney 2002): no individual can be uniquely distinguished from a group of size k, for large values of k. Depending on the system, the value of k can vary between a few hundred and hundreds of thousands. In Tor, messages are encrypted and routed along numerous different computers, thereby obscuring the original sender of the message (and thus providing anonymity). Similarly, in Freenet content is stored in encrypted form by all users of the system. Since users themselves do not have the necessary decryption keys, they do not know what kind of content is stored, by the system, on their own computer. This provides plausible deniability and privacy. The system can at any time retrieve the encrypted content and send it to different Freenet users.
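The layered encryption that underlies Tor's routing can be illustrated with the sketch below, which uses the third-party Python cryptography package as a stand-in; the real Tor protocol negotiates keys with each relay through circuits and is considerably more involved, so this is only the core idea under simplifying assumptions.

```python
# Onion-routing sketch: wrap a message in one encryption layer per relay.
from cryptography.fernet import Fernet

relay_keys = [Fernet.generate_key() for _ in range(3)]   # one key per relay

def wrap(message: bytes, keys) -> bytes:
    # Encrypt for the exit relay first, then wrap each earlier relay around it.
    for key in reversed(keys):
        message = Fernet(key).encrypt(message)
    return message

def unwrap_at_relays(onion: bytes, keys) -> bytes:
    # Each relay peels off exactly one layer; only the exit sees the plaintext,
    # and no single relay knows both the sender and the destination.
    for key in keys:
        onion = Fernet(key).decrypt(onion)
    return onion

onion = wrap(b"GET /page HTTP/1.1", relay_keys)
print(unwrap_at_relays(onion, relay_keys))   # b'GET /page HTTP/1.1'
```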
Privacy enhancing technologies also have their downsides. For example, Tor, the tool that allows anonymized communication and browsing over the Internet, is susceptible to an attack whereby, under certain circumstances, the anonymity of the user is no longer guaranteed (Back, Möller, & Stiglic 2001; Evans, Dingledine, & Grothoff 2009). Freenet (and other tools) have similar problems (Douceur 2002). Note that for such attacks to work, an attacker needs to have access to large resources that in practice are only realistic for intelligence agencies of countries. However, there are other risks. Configuring such software tools correctly is difficult for the average user, and when the tools are not correctly configured anonymity of the user is no longer guaranteed. And there is always the risk that the computer on which the privacy-preserving software runs is infected by a Trojan horse (or other digital pest) that monitors all communication and knows the identity of the user.
Another option for providing anonymity is the anonymization of data through special software. Tools exist that remove patient names and reduce age information to intervals: the age 35 is then represented as falling in the range 30–40. The idea behind such anonymization software is that a record can no longer be linked to an individual, while the relevant parts of the data can still be used for scientific or other purposes. The problem here is that it is very hard to anonymize data in such a way that all links with an individual are removed and the resulting anonymized data is still useful for research purposes. Researchers have shown that it is almost always possible to reconstruct links with individuals by using sophisticated statistical methods (Danezis, Diaz, & Troncoso 2007) and by combining multiple databases (Anderson 2008) that contain personal information. Techniques such as k-anonymity might also help to generalize the data enough to make it unfeasible to de-anonymize data (LeFevre et al. 2005).
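The generalization step can be sketched as follows; the records, the choice of quasi-identifiers and the interval widths are hypothetical, and real anonymization tools are far more sophisticated.

```python
# Generalisation-based anonymisation with a simple k-anonymity check.
from collections import Counter

records = [
    {"age": 35, "zip": "1071", "diagnosis": "flu"},
    {"age": 38, "zip": "1071", "diagnosis": "asthma"},
    {"age": 52, "zip": "1074", "diagnosis": "flu"},
    {"age": 57, "zip": "1074", "diagnosis": "diabetes"},
]

def generalise(record: dict) -> dict:
    low = (record["age"] // 10) * 10
    return {"age": f"{low}-{low + 9}",         # 35 becomes '30-39'
            "zip": record["zip"][:2] + "**",   # truncate the postcode
            "diagnosis": record["diagnosis"]}

def is_k_anonymous(rows: list, k: int) -> bool:
    # Every combination of quasi-identifiers must occur at least k times.
    groups = Counter((r["age"], r["zip"]) for r in rows)
    return min(groups.values()) >= k

anonymised = [generalise(r) for r in records]
print(is_k_anonymous(anonymised, k=2))   # True for this toy data
```

As the surrounding text stresses, passing such a check does not by itself prevent re-identification once the data is combined with other sources.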
3.3 Cryptography
Cryptography has long been used as a means to protect data, dating back to the Caesar cipher more than two thousand years ago. Modern cryptographic techniques are essential in any IT system that needs to store (and thus protect) personal data, for example by providing secure (confidential) connections for browsing (HTTPS) and networking (VPN). Note however that by itself cryptography does not provide any protection against data breaching; only when applied correctly in a specific context does it become a 'fence' around personal data. In addition, cryptographic schemes that become outdated by faster computers or new attacks may pose threats to (long-term) privacy.
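For historical illustration, the Caesar cipher mentioned above can be written in a few lines; with only 25 possible keys it offers, of course, no protection by modern standards.

```python
# Caesar cipher: shift each letter by a fixed amount (illustrative only, insecure).
def caesar(text: str, shift: int) -> str:
    return "".join(
        chr((ord(ch) - ord("A") + shift) % 26 + ord("A")) if ch.isalpha() else ch
        for ch in text.upper()
    )

ciphertext = caesar("ATTACK AT DAWN", 3)
print(ciphertext)              # DWWDFN DW GDZQ
print(caesar(ciphertext, -3))  # ATTACK AT DAWN
```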
Cryptography is a large field, so any description here will be incomplete. The focus will instead be on some newer cryptographic techniques, in particular homomorphic encryption, that have the potential to become very important for processing and searching in personal data.
Various techniques exist for searching through encrypted data (Song et al. 2000, Wang et al. 2016), which provides a form of privacy protection (the data is encrypted) and selective access to sensitive data. One relatively new technique that can be used for designing privacy-preserving systems is 'homomorphic encryption' (Gentry 2009, Acar et al. 2018). Homomorphic encryption allows a data processor to process encrypted data, i.e. users could send personal data in encrypted form and get back some useful results – for example, recommendations of movies that online friends like – in encrypted form. The original user can then again decrypt the result and use it without revealing any personal data to the data processor. Homomorphic encryption, for example, could be used to aggregate encrypted data, thereby allowing both privacy protection and useful (anonymized) aggregate information. The technique is currently not widely applied; there are serious performance issues if one wants to apply full homomorphic encryption to the large amounts of data stored in today's systems. However, variants of the original homomorphic encryption scheme are emerging, such as Somewhat Homomorphic Encryption (Badawi et al. 2018), that are showing promise to be more widely applied in practice.
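The additively homomorphic property can be illustrated with a toy version of the Paillier cryptosystem, an older, partially homomorphic scheme that predates Gentry's fully homomorphic construction. The tiny primes below are purely for demonstration and offer no security; real systems use keys thousands of bits long and vetted libraries.

```python
import math
import random

# Toy Paillier cryptosystem (additively homomorphic) with tiny, insecure primes.
p, q = 293, 433                 # demo primes only
n = p * q
n2 = n * n
g = n + 1
lam = math.lcm(p - 1, q - 1)
mu = pow(lam, -1, n)            # modular inverse of lambda mod n

def encrypt(m: int) -> int:
    r = random.randrange(1, n)
    while math.gcd(r, n) != 1:
        r = random.randrange(1, n)
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c: int) -> int:
    x = pow(c, lam, n2)
    return (((x - 1) // n) * mu) % n

# Homomorphic property: multiplying ciphertexts adds the underlying plaintexts,
# so an untrusted processor can aggregate encrypted values without seeing them.
c1, c2 = encrypt(41), encrypt(17)
aggregate = (c1 * c2) % n2
print(decrypt(aggregate))        # 58
```

The data processor only ever handles ciphertexts, yet the data subject (who holds the private key) can recover the aggregate result, which is the pattern the surrounding text describes.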
The main idea behind blockchain technology was first described in the seminal paper on Bitcoin (Nakamoto, n.d., Other Internet Resources). A blockchain is basically a distributed ledger that stores transactions in a non-repudiable way, without the use of a trusted third party. Cryptography is used to ensure that all transactions are 'approved' by members of the blockchain and stored in such a way that they are linked to previous transactions and cannot be removed. Although focused on data integrity and not inherently anonymous, blockchain technology enables many privacy-related applications (Yli-Huumo et al. 2016, Karame and Capkun 2018), such as anonymous cryptocurrency (Narayanan et al. 2016) and self-sovereign identity (see below).
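The tamper-evidence that comes from linking transactions can be sketched as follows; consensus, signatures and mining are omitted, so this shows only the hash-chaining core of the idea, not a working blockchain.

```python
# Hash-chain sketch: each block commits to the previous block's hash, so
# altering any stored transaction invalidates every later block.
import hashlib
import json

def block_hash(block: dict) -> str:
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def add_block(chain: list, transactions: list) -> None:
    previous = block_hash(chain[-1]) if chain else "0" * 64
    chain.append({"prev": previous, "tx": transactions})

def chain_is_valid(chain: list) -> bool:
    return all(
        chain[i]["prev"] == block_hash(chain[i - 1]) for i in range(1, len(chain))
    )

ledger = []
add_block(ledger, ["alice pays bob 5"])
add_block(ledger, ["bob pays carol 2"])
print(chain_is_valid(ledger))              # True
ledger[0]["tx"] = ["alice pays bob 500"]   # tamper with history
print(chain_is_valid(ledger))              # False
```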
3.4 Identity management
The use and management of users' online identifiers are crucial in the current Internet and social networks. Online reputations become more and more important, both for users and for companies. In the era of big data, correct information about users has an increasing monetary value.
'Single sign on' frameworks, provided by independent third parties (OpenID) but also by large companies such as Facebook, Microsoft and Google (Ko et al. 2010), make it easy for users to connect to numerous online services using a single online identity. These online identities are usually directly linked to the real world (offline) identities of individuals; indeed Facebook, Google and others require this form of log on (den Haak 2012). Requiring a direct link between online and 'real world' identities is problematic from a privacy perspective, because they allow profiling of users (Benevenuto et al. 2012). Not all users will realize how large the amount of data is that companies gather in this manner, or how easy it is to build a detailed profile of users. Profiling becomes even easier if the profile information is combined with other techniques such as implicit authentication via cookies and tracking cookies (Mayer & Mitchell 2012).
From a privacy perspective a better solution would be the use of attribute-based authentication (Goyal et al. 2006), which allows access to online services based on the attributes of users, for example their friends, nationality, age, etc. Depending on the attributes used, they might still be traced back to specific individuals, but this is no longer crucial. In addition, users can no longer be tracked to different services because they can use different attributes to access different services, which makes it difficult to trace online identities over multiple transactions, thus providing unlinkability for the user. Recently (Allen 2016, Other Internet Resources), the concept of self-sovereign identity has emerged, which aims for users to have complete ownership and control over their own digital identities. Blockchain technology is used to make it possible for users to control a digital identity without the use of a traditional trusted third party (Baars 2016).
4. Emerging technologies and our understanding of privacy
In the previous sections, we have outlined how current technologies may impact privacy, as well as how they may contribute to mitigating undesirable effects. However, there are future and emerging technologies that may have an even more profound impact. Consider for example brain-computer interfaces. In case computers are connected directly to the brain, not only behavioral characteristics are subject to privacy considerations, but even one's thoughts run the risk of becoming public, with decisions of others being based upon them. In addition, it could become possible to change one's behavior by means of such technology. Such developments therefore require further consideration of the reasons for protecting privacy. In particular, when brain processes could be influenced from the outside, autonomy would be a value to reconsider to ensure adequate protection.
Apart from evaluating information technology against current moral norms, one also needs to consider the possibility that technological changes influence the norms themselves (Boenink, Swierstra & Stemerding 2010). Technology thus does not only influence privacy by changing the accessibility of information, but also by changing the privacy norms themselves. For example, social networking sites invite users to share more information than they otherwise might. This 'oversharing' becomes accepted practice within certain groups. With future and emerging technologies, such influences can also be expected and therefore they ought to be taken into account when trying to mitigate effects.
Another fundamental question is whether, given the future (and even current) level of informational connectivity, it is feasible to protect privacy by trying to hide information from parties who may use it in undesirable ways. Gutwirth & De Hert (2008) argue that it may be more feasible to protect privacy by transparency – by requiring actors to justify decisions made about individuals, thus insisting that decisions are not based on illegitimate information. This approach comes with its own problems, as it might be hard to prove that the wrong information was used for a decision. Still, it may well happen that citizens, in turn, start data collection on those who collect data about them, e.g. governments. Such 'counter(sur)veillance' may be used to gather information about the use of information, thereby improving accountability (Gürses et al. 2016). The open source movement may also contribute to transparency of data processing. In this context, transparency can be seen as a pro-ethical condition contributing to privacy (Turilli & Floridi 2009).
It has been argued that the precautionary principle, well known in environmental ethics, might have a role in dealing with emerging information technologies as well (Pieters & van Cleeff 2009; Som, Hilty & Köhler 2009). The principle would see to it that the burden of proof for absence of irreversible effects of information technology on society, e.g. in terms of power relations and equality, would lie with those advocating the new technology. Precaution, in this sense, could then be used to impose restrictions at a regulatory level, in combination with or as an alternative to empowering users, thereby potentially contributing to the prevention of informational overload on the user side. Apart from general debates about the desirable and undesirable features of the precautionary principle, challenges to it lie in its translation to social effects and social sustainability, as well as to its application to consequences induced by intentional actions of agents. Whereas the occurrence of natural threats or accidents is probabilistic in nature, those who are interested in improper use of information behave strategically, requiring a different approach to risk (i.e. security as opposed to safety). In addition, proponents of precaution will need to balance it with other important principles, viz., of informed consent and autonomy.
Finally, it is appropriate to note that not all social effects of information technology concern privacy (Pieters 2017). Examples include the effects of social network sites on friendship, and the verifiability of results of electronic elections. Therefore, value-sensitive design approaches and impact assessments of information technology should not focus on privacy only, since information technology affects many other values as well.
Bibliography
- Abelson, H., Anderson, R., Bellovin, S. M., Benaloh, J., Blaze, M., Diffie, W., & Rivest, R. L., 2015, 'Keys under doormats: mandating insecurity by requiring government access to all data and communications', Journal of Cybersecurity, 1(1): 69–79.
- Acar, A., Aksu, H., Uluagac, A. S., & Conti, M., 2018, 'A survey on homomorphic encryption schemes: Theory and implementation', ACM Computing Surveys (CSUR), 51(4): 79.
- Allen, A., 2011, Unpopular Privacy: What Must We Hide?, Oxford: Oxford University Press.
- Anderson, R.J., 2008, Security Engineering: A guide to building dependable distributed systems, Indianapolis, IN: Wiley.
- Baars, D., 2016, Towards Self-Sovereign Identity using Blockchain Technology, Ph.D. Thesis, University of Twente.
- Back, A., U. Möller, & A. Stiglic, 2001, 'Traffic analysis attacks and trade-offs in anonymity providing systems', in Information Hiding, Berlin: Springer, pp. 245–257.
- Al Badawi, A., Veeravalli, B., Mun, C. F., & Aung, K. M. M., 2018, 'High-performance FV somewhat homomorphic encryption on GPUs: An implementation using CUDA', IACR Transactions on Cryptographic Hardware and Embedded Systems, 2: 70–95. doi:10.13154/tches.v2018.i2.70-95
- Bellman, S., E.J. Johnson, & G.L. Lohse, 2001, 'On site: to opt-in or opt-out?: it depends on the question', Communications of the ACM, 44(2): 25–27.
- Benevenuto, F., T. Rodrigues, M. Cha, & V. Almeida, 2012, 'Characterizing user navigation and interactions in online social networks', Information Sciences, 195: 1–24.
- Blaauw, M.J., 2013, 'The Epistemic Account of Privacy', Episteme, 10(2): 167–177.
- Boenink, M., T. Swierstra, & D. Stemerding, 2010, 'Anticipating the interaction between technology and morality: a scenario study of experimenting with humans in bionanotechnology', Studies in Ethics, Law, and Technology, 4(2): 1–38. doi:10.2202/1941-6008.1098
- Brey, P., 2005, 'Freedom and privacy in ambient intelligence', Ethics and Information Technology, 7(3): 157–166.
- Broenink, G., J.H. Hoepman, C.V.T. Hof, R. Van Kranenburg, D. Smits, & T. Wisman, 2010, 'The privacy coach: Supporting customer privacy in the internet of things', arXiv preprint 1001.4459 [available online].
- Bruynseels, K. & M.J. van den Hoven, 2015, 'How to do Things with personal Big Biodata', in B. Roessler and D. Mokrokinska (eds.), Social Dimensions of Privacy: Interdisciplinary Perspectives, Cambridge: Cambridge University Press, pp. 122–40.
- Cadwalladr, C., and Graham-Harrison, E., 2018, 'The Cambridge Analytica files', The Guardian, 21: 6–7.
- Cavoukian, A., 2009, Privacy by Design, Ottawa: Information and Privacy Commissioner of Ontario, Canada. [Cavoukian 2009 available online (PDF)].
- –––, 2010, 'Privacy by Design: The Definitive workshop', Identity in the Information Society, 3(2): 121–126.
- Ceross, A., and A. Simpson, 2018, 'Rethinking the Proposition of Privacy Engineering', in Proceedings of New Security Paradigms Workshop (NSPW '18, Windsor, UK), New York: Association for Computing Machinery, 89–102. doi:10.1145/3285002.3285006
- Clarke, R., 2009, 'Privacy impact assessment: Its origins and development', Computer law & security review, 25(2): 123–135.
- Clarke, I., O. Sandberg, B. Wiley, & T. Hong, 2001, 'Freenet: A distributed anonymous information storage and retrieval system', in Designing Privacy Enhancing Technologies, Berlin: Springer, pp. 46–66.
- Colesky, M., J.-H. Hoepman, and C. Hillen, 2016, 'A critical analysis of privacy design strategies', IEEE Security and Privacy Workshops (SPW), first online 04 August 2016, doi:10.1109/SPW.2016.23
- Custers, B., et al., 2018, 'Consent and privacy', in The Routledge Handbook of the Ethics of Consent, London: Routledge, pp. 247–258.
- Dahl, J. Y., & A.R. Sætnan, 2009, 'It all happened so slowly: On controlling function creep in forensic DNA databases', International journal of law, crime and justice, 37(3): 83–103.
- Danezis, G., C. Diaz, & C. Troncoso, 2007, 'Two-sided statistical disclosure attack', in Proceedings of the 7th international conference on Privacy enhancing technologies, Berlin: Springer, pp. 30–44.
- DeCew, Judith Wagner, 1997, Pursuit of Privacy: Law, Ethics, and the Rise of Technology, Ithaca, NY: Cornell University Press.
- Dechesne, F., M. Warnier, & J. van den Hoven, 2013, 'Ethical requirements for reconfigurable sensor technology: a challenge for value sensitive design', Ethics and Information Technology, 15(3): 173–181.
- Delaune, S., S. Kremer, & M. Ryan, 2006, 'Coercion-resistance and receipt-freeness in electronic voting', in the Proceedings of the 19th IEEE Computer Security Foundations Workshop, IEEE Computer Society Press, pp. 28–39. [Delaune et al. 2006 available online]
- Dingledine, R., N. Mathewson, & P. Syverson, 2004, 'Tor: The second-generation onion router', in Proceedings of the 13th conference on USENIX Security Symposium (Volume 13), Berkeley, CA: USENIX Association, pp. 303–320. [Dingledine et al. 2004 available online (pdf)]
- Douceur, J., 2002, 'The Sybil attack', in Peer-to-peer Systems, Berlin: Springer, pp. 251–260.
- Ellison, N. B., 2007, 'Social network sites: Definition, history, and scholarship', Journal of Computer-Mediated Communication, 13(1): 210–230.
- Evans, N.S., R. Dingledine, & C. Grothoff, 2009, 'A practical congestion attack on Tor using long paths', in Proceedings of the 18th conference on USENIX security symposium, Berkeley, CA: USENIX Association, pp. 33–50. [Evans et al. 2009 available online]
- Falgoust, M., 2016, 'Data Science and Designing for Privacy', Techné: Research in Philosophy and Technology, 20(1): 51–68.
- Floridi, L., 2008, 'Artificial intelligence's new frontier: Artificial companions and the fourth revolution', Metaphilosophy, 39(4–5): 651–655.
- Friedewald, M. & O. Raabe, 2011, 'Ubiquitous computing: An overview of technology impacts', Telematics and Informatics, 28(2): 55–65.
- Friedman, B., P.H. Kahn, Jr, & A. Borning, 2006, 'Value sensitive design and information systems', in Human-computer interaction in management information systems: Foundations, P. Zhang & D. Galletta (eds.), Armonk: M.E. Sharp, 4.
- Gentry, C., 2009, 'Fully homomorphic encryption using ideal lattices', in Proceedings of the 41st annual ACM symposium on Theory of computing, ACM, pp. 169–178.
- Goyal, V., O. Pandey, A. Sahai, & B. Waters, 2006, 'Attribute-based encryption for fine-grained access control of encrypted data', in Proceedings of the 13th ACM conference on Computer and communications security, ACM, pp. 89–98.
- Gürses, S., A. Kundnani, & J. Van Hoboken, 2016, 'Crypto and empire: the contradictions of counter-surveillance advocacy', Media, Culture & Society, 38(4): 576–590.
- Gutwirth, S. & P. De Hert, 2008, 'Regulating profiling in a democratic constitutional state', in Hildebrandt and Gutwirth 2008: 271–302.
- den Haak, B., 2012, 'Integrating user customization and authentication: the identity crisis', Security & Privacy, IEEE, 10(5): 82–85.
- Heersmink, R., J. van den Hoven, N.J. van Eck, & J. van den Berg, 2011, 'Bibliometric mapping of computer and information ethics', Ethics and information technology, 13(3): 241–249.
- Hildebrandt, M., 2008, 'Defining Profiling: A New Type of Knowledge?', in Hildebrandt and Gutwirth 2008: 17–45.
- Hildebrandt, M. & S. Gutwirth (eds.), 2008, Profiling the European Citizen: Cross-disciplinary Perspectives, Dordrecht: Springer Netherlands.
- Hoare, T., 2003, 'The verifying compiler: A grand challenge for computing research', in Proceedings of the 12th international conference on Compiler construction, Berlin: Springer, pp. 262–272.
- Hone, K. & J.H.P. Eloff, 2002, 'Information security policy – what do international information security standards say?', Computers & Security, 21(5): 402–409.
- van den Hoven, J., 1999, 'Privacy and the Varieties of Informational Wrongdoing', Australian Journal of Professional and Applied Ethics, 1(1): 30–44.
- –––, 2008, 'Information technology, privacy, and the protection of personal data', in Information technology and moral philosophy, J. Van Den Hoven and J. Weckert (eds.), Cambridge: Cambridge University Press, pp. 301–322.
- van den Hoven, J., G.J. Lokhorst, & I. Van de Poel, 2012, 'Engineering and the problem of moral overload', Science and engineering ethics, 18(1): 143–155.
- van den Hoven, J., Vermaas, P., & Van de Poel, I. (eds.), 2015, Handbook of Ethics, Values and Technological Design, Dordrecht: Springer.
- Karame, G., and Capkun, S., 2018, 'Blockchain Security and Privacy', IEEE Security & Privacy, 16(4): 11–12.
- Ko, M.N., G.P. Cheek, M. Shehab, & R. Sandhu, 2010, 'Social-networks connect services', Computer, 43(8): 37–43.
- Krishnamurthy, B. & C.E. Wills, 2009, 'On the leakage of personally identifiable information via online social networks', in Proceedings of the 2nd ACM workshop on Online social networks, ACM, pp. 7–12.
- Lawner, K. J., 2002, 'Post-September 11th International Surveillance Activity – A Failure of Intelligence: The Echelon Interception System & (and) the Fundamental Right to Privacy in Europe', Pace International Law Review, 14(2): 435–480.
- Leenes, R., and E. Kosta, 2015, 'Taming the cookie monster with Dutch law – a tale of regulatory failure', Computer Law & Security Review, 31(3): 317–335.
- LeFevre, K., D.J. DeWitt, & R. Ramakrishnan, 2005, 'Incognito: Efficient full-domain k-anonymity', in Proceedings of the 2005 ACM SIGMOD international conference on Management of data, ACM, pp. 49–60.
- Lessig, Lawrence, 1999, Code and Other Laws of Cyberspace, New York: Basic Books.
- Loeckx, J., K. Sieber, & R.D. Stansifer, 1985, The foundations of program verification, Chichester: John Wiley & Sons.
- Matheson, David, 2007, 'Unknowableness and Informational Privacy', Journal of Philosophical Research, 32: 251–67.
- Mayer, J.R. & J.C. Mitchell, 2012, 'Third-party web tracking: Policy and technology', in Security and Privacy (SP), 2012 IEEE Symposium on, IEEE, pp. 413–427.
- Michener, J., 1999, 'System insecurity in the Internet age', Software, IEEE, 16(4): 62–69.
- Motlagh, N. H., Bagaa, M., & Taleb, T., 2017, 'UAV-based IoT platform: A crowd surveillance use case', IEEE Communications Magazine, 55(2): 128–134.
- Narayanan, A., Bonneau, J., Felten, E., Miller, A., & Goldfeder, S., 2016, Bitcoin and cryptocurrency technologies: a comprehensive introduction, Princeton: Princeton University Press.
- Negley, G., 1966, 'Philosophical Views on the Value of Privacy', Law and Contemporary Problems, 31: 319–325.
- Nissenbaum, Helen, 2004, 'Privacy as Contextual Integrity', Washington Law Review, 79: 101–139.
- OECD, 1980 [2013], The OECD Privacy Framework, 2013, available in PDF; revised and expanded from the original Guidelines on the Protection of Privacy and Transborder Flows of Personal Data, Organization for Economic Co-operation and Development. [1980 version available online]
- Palmer, D.E., 2005, 'Pop-ups, cookies, and spam: toward a deeper analysis of the ethical significance of internet marketing practices', Journal of business ethics, 58(1–3): 271–280.
- Pieters, W., 2011, 'Explanation and trust: what to tell the user in security and AI?', Ethics and information technology, 13(1): 53–64.
- –––, 2013, 'On thinging things and serving services: technological mediation and inseparable goods', Ethics and information technology, 15(3): 195–208.
- –––, 2017, 'Beyond individual-centric privacy: Information technology in social systems', The Information Society, 33(5): 271–281.
- Pieters, W. & A. van Cleeff, 2009, 'The precautionary principle in a world of digital dependencies', Computer, 42(6): 50–56.
- Rössler, Beate (ed.), 2004, Privacies: Philosophical Evaluations, Stanford, CA: Stanford University Press.
- –––, 2001 [2005], The Value of Privacy, Cambridge: Polity Press; original German version Der Wert des Privaten, Frankfurt am Main: Suhrkamp Verlag, 2001.
- Rubel, Alan, 2011, 'The Particularized Judgment Account of Privacy', Res Publica, 17(3): 275–90.
- Ruiter, J. & M. Warnier, 2011, 'Privacy Regulations for Cloud Computing: Compliance and Implementation in Theory and Practice', in Computers, Privacy and Data Protection: an Element of Choice, S. Gutwirth, Y. Poullet, P. De Hert, and R. Leenes (eds.), Dordrecht: Springer Netherlands, pp. 361–376.
- Solove, D., 2006, 'A Taxonomy of Privacy', University of Pennsylvania Law Review, 154: 477–564.
- Som, C., L.M. Hilty, & A.R. Köhler, 2009, 'The precautionary principle as a framework for a sustainable information society', Journal of business ethics, 85(3): 493–505.
- Song, D.X., D. Wagner, & A. Perrig, 2000, 'Practical techniques for searches on encrypted data', in Proceedings of the 2000 IEEE Symposium on Security and Privacy (S&P 2000), IEEE, pp. 44–55.
- Sprenger, Polly, 1999, 'Sun on Privacy: "Get Over It"', Wired. [Sprenger 1999 available online]
- Sweeney, L., 2002, 'K-anonymity: A model for protecting privacy', International Journal of Uncertainty, Fuzziness and Knowledge-Based Systems, 10(05): 557–570.
- Tavani, H.T., 2004, 'Genomic research and data-mining technology: Implications for personal privacy and informed consent', Ethics and information technology, 6(1): 15–28.
- Taylor, L., L. Floridi, and B. Van der Sloot (eds.), 2017, Group privacy: New challenges of data technologies (Philosophical Studies Series: Vol. 126), Dordrecht: Springer.
- Thomson, Judith Jarvis, 1975, 'The Right to Privacy', Philosophy and Public Affairs, 4: 295–314.
- Turilli, M. & L. Floridi, 2009, 'The ethics of information transparency', Ethics and Information Technology, 11(2): 105–112.
- Wang, Y., Wang, J., and Chen, X., 2016, 'Secure searchable encryption: a survey', Journal of Communications and Information Networks, 1(4): 52–65.
- Warnier, M., Dechesne, F., and Brazier, F.M.T., 2015, 'Design for the Value of Privacy', in J. van den Hoven, P. Vermaas, I. van de Poel (eds.), Handbook of Ethics, Values, and Technological Design, Dordrecht: Springer, 431–445.
- Warren, Samuel D. & Louis D. Brandeis, 1890, 'The Right to Privacy', Harvard Law Review, 4(5): 193–220. [Warren and Brandeis 1890 available online]
- Westin, Alan F., 1967, Privacy and Freedom, New York: Atheneum.
- Yli-Huumo, J., Ko, D., Choi, S., Park, S., and Smolander, K., 2016, 'Where is current research on blockchain technology? – a systematic review', PloS One, 11(10): e0163477. doi:10.1371/journal.pone.0163477
- Zuboff, S., 2019, The age of surveillance capitalism: the fight for the future at the new frontier of power, London: Profile Books.
Other Internet Resources
- Albright, J., 2016, 'How Trump's campaign used the new data-industrial complex to win the election', LSE, US Centre, USApp-American Politics and Policy Blog.
- Allen, C., 2016, The Path to Self-Sovereign Identity, Coindesk.
- Cadwalladr, C., 2019, Facebook's role in Brexit – and the threat to democracy. TED Talk
- Danezis, G & S. Gürses, 2010, 'A critical review of 10 years of Privacy Technology.'
- Gellman, R., 2014, 'Fair information practices: a basic history', Version 2.12, August 3, 2014, online manuscript.
- Nakamoto, S., Bitcoin: A Peer-to-Peer Electronic Cash System, www.bitcoin.org.
- PCI DSS (= Payment Card Industry Data Security Standard), v3.2 (2018), PCI DSS related documents, PCI Security Standards Council, LLC.
- Richter, H., W. Mostowski, & E. Poll, 2008, 'Fingerprinting passports', presented at NLUUG Spring Conference on Security.
- Electronic Privacy Information Center.
- European Commission, Data protection.
- US Department of State, Privacy Act.
Related Entries
computing: and moral responsibility | ethics: search engines and | information | information technology: and moral values | privacy | social networking and ethics
Copyright © 2019 by
Jeroen van den Hoven <m.j.vandenhoven@tudelft.nl>
Martijn Blaauw <M.J.Blaauw@tudelft.nl>
Wolter Pieters <W.Pieters@tudelft.nl>
Martijn Warnier <M.E.Warnier@tudelft.nl>
Cisco Privacy Statement
Cisco Systems, Inc. and its subsidiaries (collectively 'Cisco') are committed to protecting your privacy and providing you with a positive experience on our websites and while using our products and services ('Solutions').
This Privacy Statement applies to Cisco websites and Solutions that link to or reference this Privacy Statement and describes how we handle personal information and the choices available to you regarding collection, use, access, and how to update and correct your personal information. Additional information on our personal information practices may be provided in offer descriptions, privacy data sheets, or other notices provided prior to or at the time of data collection. Certain Cisco websites and Solutions may have their own privacy documentation describing how we handle personal information for those websites or Solutions specifically. To the extent a specific notice for a website or Solution conflicts with this Privacy Statement, such specific notice will control.
What is Personal Information?
'Personal information' is any information that can be used to identify an individual and may include name, address, email address, phone number, login information (account number, password), social media account information, or payment card number.
It may also include device identifiers, data and telemetry (such as IP or MAC address) when such data is linked or tied to a specific person's device.
If we link other data with your personal information, we will treat that linked data as personal information.
Collection of Your Personal Information
We may collect data, including personal information, about you as you use our websites and Solutions and interact with us. We also acquire personal information from trusted third-party sources and engage third parties to collect personal information on our behalf.
We collect personal information for a variety of business reasons, such as:
- Processing your order, including payment transactions.
- Providing you with a newsletter subscription.
- Sending and managing marketing communications and preferences.
- Creating an account.
- Provisioning websites and Solutions and enabling the use of certain features.
- Personalizing, improving and enhancing user experience and Solutions.
- Providing customer service.
- Managing a job application.
- Administering online education, testing and certifications.
We and the third parties we engage may combine the information collected across our websites and Solutions from you over time with information obtained from other sources. This helps us improve accuracy and completeness and allows us to better tailor our interactions with you.
If you choose to provide Cisco with a third party's personal information (such as name, email, and phone number), you represent that you have the third party's permission to do so. Examples include forwarding reference or marketing material to a friend or sending job referrals. Third parties may unsubscribe from any future communications by following the link provided in the initial message or by clicking here. In some instances, Cisco and the third parties we engage may automatically collect data through cookies, web logs, web beacons, and other similar applications. This information is used to better understand and improve the usability, performance, and effectiveness of the website or Solution and to help personalize features, content, or offers for you. Please read the 'Use of Cookies and other Web Technologies' section below for more information.
Uses of Your Personal Information
We may use your personal information for the purposes of operating and helping to ensure the security of our business, delivering, improving, and customizing our websites and Solutions, sending notices, marketing and other communications, and for other legitimate purposes permitted by applicable law. Some of the ways we may use personal information include:
- Delivering a Solution you have requested.
- Analyzing, supporting, and improving our websites and Solutions and user experience.
- Personalizing websites and Solutions, newsletters and other communications.
- Administering and processing your training and certification exams.
- Managing your relationship and interactions with Cisco.
- Sending communications to you, including for marketing or customer satisfaction purposes, either directly from Cisco or from our partners.
You can modify your communication preferences at any time. See Your Choices and Selecting Your Communication Preferences below.
Access to and Accuracy of Your Personal Information
We need your help to keep your personal information accurate and up to date. We provide options to access, correct, suppress, or delete your personal information:
- You can view or edit your Cisco.com personal information and preferences online by using the Cisco Profile Management Tool.
- Some Cisco entities may act as or be considered 'data controllers.' When a Cisco entity is acting as a data controller, you can exercise your rights of access and request corrections, suppression, or deactivations under applicable data protection laws directly with that Cisco entity as described in the specific Solution documentation.
- If you need additional assistance, or help with accessing, correcting, suppressing, or deleting your personal information, please feel free to contact us directly. We make good faith efforts to honor reasonable requests to access, delete, update, suppress, or correct your data. We will respond to your request within 30 days. If we are unable to honor your request or need more time, we will provide you with an explanation.
- In certain circumstances, some Cisco entities may act as or be considered 'data processors'. When a Cisco entity is acting as a data processor and you wish to exercise your rights of access and request corrections, suppression, or deactivations, Cisco will direct you to the data controller under the applicable data protection laws.
Your Choices and Selecting Your Communication Preferences
We give you the choice of receiving a variety of information related to our business, programs, website, and Solutions. You can manage your communication preferences through the following methods:
- By following the instructions included in each promotional email from us to unsubscribe from that mailing.
- By completing and submitting this form or by contacting us via mail at: Cisco Systems, Inc., Privacy Legal Department, 170 West Tasman Dr., San Jose, CA 95134, USA. Please be sure to include your name, email address, and specific, relevant information about the material you no longer wish to receive.
- For short message services ('SMS Services'), reply 'STOP,' 'END,' or 'QUIT' to the SMS text message you have received.
These choices do not apply to service notifications or other required communications that are considered part of certain programs, websites, and Solutions, which you may receive periodically unless you stop using or cancel the use of the program, website, or Solution in accordance with its terms and conditions. With your permission, we may also share your personal information with Cisco business partners or vendors, so they may send you information about websites, programs, products or services that may be of interest to you. To opt-out of Cisco sharing with third parties for their marketing purposes, please click here.
By using our websites, Solutions, or otherwise engaging with or providing personal information to us, you agree that we may communicate with you electronically regarding security, privacy, and administrative issues relating to your use. For example, if we learn of a breach of our security systems, we may attempt to notify you electronically by posting a notice on our websites, sending an email, or otherwise contacting you.
Sharing Your Personal Information
We may share your personal information with third parties for the purposes of operating our business, delivering, improving, securing, and customizing our websites and Solutions, sending marketing and other communications related to our business, and for other legitimate purposes permitted by applicable law or otherwise with your consent.
We may share personal information in the following ways:
- Within Cisco and any of our worldwide subsidiaries for the purposes of data processing, such as marketing, business operations, security, website or Solution functionality, or storage.
- With Cisco business partners or vendors, so that they may share information with you about their products or services. To opt-out of Cisco sharing with third parties for their marketing purposes, please click here.
- With business partners, service vendors, authorized third-party agents, or contractors to provide a requested website, Solution, service or transaction. Examples include, but are not limited to: processing of orders and credit card transactions, hosting websites, hosting seminar registration, assisting with sales-related efforts or post-sales support, and providing customer support.
- In connection with, or during negotiations of, any merger, sale of company assets, consolidation or restructuring, financing, or acquisition of all or a portion of our business by or to another company.
- In response to a request for information by a competent authority or third party if we believe disclosure is in accordance with, or is otherwise required by, any applicable law, regulation or legal process.
- With law enforcement officials, government authorities, or other third parties as necessary to comply with legal process or meet national security requirements; protect the rights, property, or safety of Cisco, its business partners, you, or others; or as otherwise required by applicable law.
- In aggregated, anonymized, and/or de-identified form that cannot reasonably be used to identify you.
- If we otherwise notify you and you consent to the sharing.
Security of Your Personal Information
We intend to protect the personal information entrusted to us and treat it securely in accordance with this Privacy Statement. Cisco implements physical, administrative, and technical safeguards designed to protect your personal information from accidental or unlawful destruction, loss, alteration, unauthorized disclosure or access. We also contractually require that our suppliers protect such information from accidental or unlawful destruction, loss, alteration, unauthorized disclosure or access. The Internet, however, cannot be guaranteed to be 100% secure, and we cannot ensure or warrant the security of any personal information you provide.
Retention of Personal Information
We will retain your personal information as needed to fulfill the purposes for which it was collected. We will retain and use your personal information as necessary to comply with our business requirements, legal obligations, resolve disputes, protect our assets, and enforce our rights and agreements.
Use of Cookies and other Web Technologies
Like many websites and web-based Solutions, Cisco uses automatic data collection tools, such as cookies, embedded web links, and web beacons. These tools collect certain standard information that your browser sends to us (e.g., Internet Protocol (IP) address, MAC address, clickstream behavior, and telemetry).
These tools help make your visit to our website and Solutions easier, more efficient, and personalized. We also use the information to improve our website and Solutions and provide greater service and value.
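To make the mechanics concrete, the following is a minimal, purely illustrative sketch (not Cisco's implementation; the cookie name, logging behavior, and server details are assumptions made for the example) of how a website can combine a first-party cookie with the standard information a browser sends on every request, such as IP address, user agent, and referring page:

```python
# Illustrative sketch only: shows the kind of "standard information" a
# browser sends with each request (IP address, user agent, referrer) and
# how a first-party cookie could be set and read back on later visits.
# The cookie name VISITOR_COOKIE is a hypothetical example.
import uuid
from http.server import BaseHTTPRequestHandler, HTTPServer

VISITOR_COOKIE = "visitor_id"  # hypothetical cookie name

class TrackingDemoHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Standard request metadata supplied automatically by the browser.
        ip_address = self.client_address[0]
        user_agent = self.headers.get("User-Agent", "unknown")
        referer = self.headers.get("Referer", "none")

        # Read an existing cookie, or mint a new identifier for this visitor.
        cookie_header = self.headers.get("Cookie", "")
        visitor_id = None
        for part in cookie_header.split(";"):
            name, _, value = part.strip().partition("=")
            if name == VISITOR_COOKIE:
                visitor_id = value
        is_new = visitor_id is None
        if is_new:
            visitor_id = uuid.uuid4().hex

        # A real analytics backend would persist this; here we only log it.
        print(f"{visitor_id} ip={ip_address} ua={user_agent!r} ref={referer}")

        self.send_response(200)
        if is_new:
            # Persist the identifier so later visits can be linked together.
            self.send_header("Set-Cookie", f"{VISITOR_COOKIE}={visitor_id}; Path=/")
        self.send_header("Content-Type", "text/plain")
        self.end_headers()
        self.wfile.write(b"ok\n")

if __name__ == "__main__":
    HTTPServer(("127.0.0.1", 8080), TrackingDemoHandler).serve_forever()
```

Running the script locally and visiting http://127.0.0.1:8080 twice shows how the identifier set on the first visit links the second request to the same browser; this linking of repeat visits is the basic mechanism behind the data collection tools described above.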
We partner with third parties to display advertising on our website and to manage our advertising on other sites. Our third-party partners may use cookies or similar technologies in order to provide you with advertising based on your browsing activities and interests. You may opt out of this advertising; however, generic, non-personalized ads will continue to be displayed.
For more information, or if you would like to opt out of interest-based advertising, see How Cisco Uses Automatic Data Collection Tools.
To update your cookie preferences, visit the Cisco Cookie Consent Manager.
Linked Websites
We may provide links to other third-party websites and services that are outside our control and not covered by this Privacy Statement. We encourage you to review the privacy statements posted on the websites you visit.
Forums and Chat Rooms
If you participate in a discussion forum, local communities, or chat room on a Cisco website, you should be aware that the information you provide there (i.e. your public profile) will be made broadly available to others, and could be used to contact you, send you unsolicited messages, or for purposes neither Cisco nor you have control over. Also, please recognize that individual forums and chat rooms may have additional rules and conditions. Cisco is not responsible for the personal information or any other information you choose to submit in these forums. To request removal of your personal information from our blog or community forum, click here. In some cases, we may not be able to remove your personal information and if this occurs, we will let you know if we are unable to do so and why.
Children's Privacy
Cisco encourages parents and guardians to take an active role in their children's online activities. Cisco does not knowingly collect personal information from children without appropriate parental or guardian consent. If you believe that we may have collected personal information from someone under the applicable age of consent in your country without proper consent, please let us know using the methods described in the Contact Us section and we will take appropriate measures to investigate and address the issue promptly.
International Transfer, Processing and Storage of Personal Information
As Cisco is a global organization, we may transfer your personal information to Cisco in the United States of America, to any Cisco subsidiary worldwide, or to third parties and business partners as described above that are located in various countries around the world. By using our websites and Solutions or providing any personal information to us, where applicable law permits, you acknowledge and accept the transfer, processing, and storage of such information outside of your country of residence where data protection standards may be different.
Cisco safeguards and enables the global transfer of personal information in a number of ways:
APEC Privacy Certification
Cisco's global privacy program, described in this Privacy Statement, complies with the Asia Pacific Economic Cooperation (APEC) Cross-Border Privacy Rules System (CBPRs) and Privacy Recognition for Processors (PRP). The APEC CBPR system and PRP provide a framework for organizations to ensure protection of personal information transferred among participating APEC economies. More information about the APEC Privacy Framework, CBPRs, and PRP can be found here. Our certification applies to our business processes across our global operations that process and transfer personal information to/from our affiliates around the world. To view our certifications, please visit each one's validation page by clicking on the TRUSTe seals.
EU, UK and Swiss-US Privacy Shields
Cisco Systems Inc. and its US-based subsidiaries: Acano LLC, Broadsoft, Inc., Cisco OpenDNS LLC, Cisco Systems Capital Corporation, Cisco WebEx LLC, CliQr Technologies LLC, CloudLock LLC, Duo Security LLC, Jasper International Services LLC, Jasper Technologies LLC, Meraki LLC, Rizio, Inc. (dba Voicea) and Scientific-Atlanta LLC (collectively 'Cisco-US') participate in and have certified compliance with the EU-US and Swiss-US Privacy Shield Frameworks and Principles as set forth by the US Department of Commerce regarding the collection, use, and retention of personal information transferred from the European Union (EU), the United Kingdom (UK), and Switzerland, respectively. Cisco-US is committed to subjecting all personal information received from European Union (EU) member countries, the UK, and Switzerland, in reliance on the EU-US and Swiss-US Privacy Shield Frameworks, to the Frameworks' applicable Principles. If there is any conflict between the terms in this Privacy Statement and the Privacy Shield Principles, the Privacy Shield Principles shall govern. To learn more about these Privacy Shield Frameworks, visit the U.S. Department of Commerce's Privacy Shield site.
Cisco-US is responsible for the processing of personal information it receives, under these Privacy Shield Frameworks, and subsequently transfers to a third party acting as an agent on its behalf. Cisco-US complies with the Privacy Shield Principles for all onward transfers of personal information from the EU, the UK, and Switzerland, including the onward transfer liability provisions. In certain situations, Cisco-US may be required to disclose personal information in response to lawful requests by public authorities, including to meet national security or law enforcement requirements.
With respect to personal information received or transferred pursuant to these Privacy Shield Frameworks, Cisco-US is subject to the regulatory enforcement powers of the US Federal Trade Commission.
EU Binding Corporate Rules - Controller
Cisco's global privacy program and policies have been approved by the Dutch, Polish, Spanish, and other relevant European privacy regulators as providing adequate safeguards for the protection of privacy, fundamental rights, and freedoms of individuals for transfers of personal information protected under European law. Cisco's Binding Corporate Rules – Controller (BCR-C) provide that transfers made by Cisco worldwide of European personal information benefit from adequate safeguards.
A copy of our BCR-C can be found here. More information about BCRs can be found here.
Complaint Resolution
If you have an unresolved privacy concern related to personal data processed or transferred by Cisco pursuant to the CBPRs, PRP, and/or Privacy Shield, or this statement, that Cisco has not addressed satisfactorily, please contact (free of charge) our U.S.-based third-party dispute resolution provider by clicking here. Alternatively, you can contact the data protection supervisory authority in your jurisdiction for assistance. (Note: Cisco's main establishment in the EU is in the Netherlands; as such, our EU lead authority is the Dutch Autoriteit Persoonsgegevens.)
Under certain conditions more fully described on the Privacy Shield website, you may be entitled to invoke binding arbitration when other dispute resolution procedures have been exhausted.
Your California Privacy Rights
California Consumer Privacy Act (CCPA)
In the last twelve months, Cisco may have collected, used, and shared personal information about you for business purposes as described in this Privacy Statement. Each category of data that may be used by Cisco or shared with third parties is outlined in this statement.
All individuals have the right to request access to and deletion of the information Cisco holds about them either online or by mail to Cisco Systems, Inc., Privacy Legal Department, 170 West Tasman Dr., San Jose, CA 95134, USA. In addition, California residents may submit a request by calling direct 408-906-2726 or toll free 833-774-2726 (833-PRI-CSCO). Cisco does not sell personal information without consent.
Cisco will not discriminate against you for exercising your rights under CCPA. Specifically, we will not:
- Deny access to our Solutions;
- Charge a different rate for the use of our Solutions; or
- Provide a different quality of service.
California Shine the Light
Residents of the State of California, under California Civil Code § 1798.83, have the right to request from companies conducting business in California a list of all third parties to which the company has disclosed personal information during the preceding year for direct marketing purposes. Alternatively, the law provides that if the company has a privacy policy that gives either an opt-out or opt-in choice for use of your personal information by third parties (such as advertisers) for marketing purposes, the company may instead provide you with information on how to exercise your disclosure choice options.
Cisco qualifies for the alternative option. We have a comprehensive privacy statement, and provide you with details on how you may either opt-out or opt-in to the use of your personal information by third parties for direct marketing purposes. Therefore, we are not required to maintain or disclose a list of the third parties that received your personal information for marketing purposes during the preceding year.
If you are a California resident and request information about how to exercise your third-party disclosure choices or CCPA rights, please click here.
Updates to this Cisco Privacy Statement
We may update this Privacy Statement from time to time. If we modify our Privacy Statement, we will post the revised version here, with an updated revision date. You agree to visit these pages periodically to be aware of and review any such revisions. If we make material changes to our Privacy Statement, we may also notify you by other means prior to the changes taking effect, such as by posting a notice on our websites or sending you a notification. By continuing to use our website after such revisions are in effect, you accept and agree to the revisions and to abide by them.
The Cisco Privacy Statement was revised and effective as of May 1, 2020.
Click here for the previous version of the privacy statement.
How to Contact Us
Should you have questions or comments related to this Privacy Statement, please contact us by clicking here or by sending mail to:
Chief Privacy Officer
Cisco Systems, Inc.
170 West Tasman Dr.
San Jose, CA 95134 USA
Americas Privacy Officer
Cisco Systems, Inc.
170 West Tasman Dr.
San Jose, CA 95134 USA
EMEAR Privacy Officer
Cisco Systems, Inc.
Haarlerbergweg 13-19,
1101 CH Amsterdam-Zuidoost, Netherlands
APJC Privacy Officer
Cisco Systems, Inc.
80 Pasir Panjang Road
Bldg 80, Lvl 25, Mapletree Biz City
Singapore 117372, Singapore