Dark Potentials in Smart Environments: Provocations and Workshops to Make Better Designers

By Thejus Kayanadath
BA (Hons) Graphic Communication, University of South Wales, 2019

A CRITICAL AND PROCESS DOCUMENTATION THESIS PAPER SUBMITTED IN PARTIAL FULFILLMENT OF THE REQUIREMENTS FOR THE DEGREE OF MASTER OF DESIGN

EMILY CARR UNIVERSITY OF ART + DESIGN
2020

© Thejus Kayanadath, 2021

Abstract

Manipulative and intrusive technologies are directly or indirectly responsible for many significant issues faced today, including misinformation, prejudice and violence, and issues with health and wellbeing. Content platforms show hyper-targeted content that reinforces people's existing biases, sometimes leading them towards harmful, extremist views on society and politics (Chakrabarti). Manipulative interfaces can make digital products extremely addictive, using psychological tricks to keep people engaged and encourage repeated usage (Harris). Many of these persuasive and privacy-intrusive patterns appear in technology-enhanced physical spaces, or 'smart environments'. The extensive use of facial recognition, the tracking of people in shared spaces, and persuasive advertisements and interactions bring issues of ethics to smart environments (Lin).

The potential for technologies to be harmful is often overlooked or undervalued by designers and technologists in general. This is evidenced by the opinions of designers of addictive social media applications, as they themselves move towards curtailing or avoiding their use of these media (Breland). As smart technologies grow more prevalent in people's public and private environments, it is increasingly important for technologists to build an understanding of the harmful possibilities enabled by their work and products.

This thesis explores manipulative and intrusive possibilities in technology-enhanced spaces. It follows a methodology of provocation-led workshops to help designers become familiar with the technologies in these spaces. Through these workshops, participants are encouraged to identify the potential for benefit and harm in scenarios they create, and to become familiar with terminology that can help them identify and communicate this potential in their work going forward. The outcomes of these workshops contribute to a growing online collection of examples of manipulative and intrusive smart environments. This resource is intended to further build awareness and understanding around these issues, and to serve as a reference for technologists, researchers, policy-makers and other interested parties in the future.

Acknowledgement

I owe the greatest gratitude to my supervisor, Haig Armen, who has provided knowledge, encouragement, and criticism through each stage of this research, and most importantly, has been and continues to be a true believer in the critical issues discussed in this thesis. This thesis would not be possible without the motivation, interest, and support of the many research participants, peers, mentors and advisors that I have had the honour of working with over the course of this research. I would also like to thank my advisor Garnet Hertz, who has in many instances looked at this research through perspectives unseen by anyone else. Finally, for providing companionship and spaces to truly express myself during the last two years, I would like to thank Alija Sule, April Nguyen, Binoodha Sasi, Khushboo Vansia and Nikoo Farvardin.
Table of Contents

Abstract
Acknowledgement
Table of Contents
1. Context
  1.1. Behavioural Psychology
    1.1.1. Behavioural Psychology in Interaction Design
  1.2. Surveillance Capitalism
  1.3. Ubiquitous Computing
  1.4. Concerns about technology ethics
  1.5. Research Questions
  1.6. Rationale
  1.7. Research Scope
2. Common Methods in Ubiquitous Computing and Ethics Research
  2.1. Design Fiction
  2.2. Card Supported Activities
  2.3. Collections of Examples
3. Methods
  3.1. Prototyping an unethical Voice Assistant TOS document
  3.2. Conversations with Researchers and Experts
  3.3. Manipulative interfaces in a retail environment
    3.3.1. Provocative prototypes for dark patterns in a retail store
    3.3.2. Workshop 1 - Provoking to stimulate unethical concepts
  3.4. Hostile Architecture as a space for manipulative interaction design
    3.4.1. Workshops 2 - Card Activities and Moments of Reflection
    3.4.2. Reflecting and Flipping Ethics
  3.5. Collecting and Coding Scenarios
    3.5.1. Website
4. Analysis and Reflection
  4.1. Effective workshop support through provocation
  4.2. 'Ethics Flips' at workshops helped identify potential guidelines or patterns for ethical technologies.
  4.3. Demonstrating a provocative prototype when running workshops influenced the scenarios the participants created.
  4.4. Modifying the workshop format for audience experience levels
  4.5. Desensitization
  4.6. Further and Alternative Directions
    4.6.1. Workshops for education
    4.6.2. Workshops for firms
    4.6.3. The general public audience
    4.6.4. Website as a point of contact
5. Discussion
  5.1. Consent in Pervasive, Invasive Environments
  5.2. Identifiers and Persuaders
    5.2.1. Identification and Persuasion
    5.2.2. Identifiers and Persuaders in Smart Environments
  5.3. Light Patterns and Light Potential
  5.4. Positive Developments
  5.5. Future Directions
6. Closing Remarks
7. References

1. Context

This thesis discusses various issues regarding the ethics of smart-environment technologies. Specifically, these include the large-scale collection and abuse of personal data, manipulative and deceptive interaction design patterns, and issues around transparency and consent.

Firms creating consumer technologies often collect personal and behavioural data at an unrestricted scale. This data is used to identify and target individuals for advertising, political messaging, news targeting and personalization, and more (Christl). Firms also commonly use persuasive design patterns to prompt people to behave in ways that benefit the firm. These patterns use theories from behavioural economics and cognitive biases to manipulate user behaviour.
For example, addictive interaction patterns commonly draw on the psychology of portion control. This results in features such as 'Autoplay', which remove opportunities for people to pause and reflect on the time spent using a technology (Harris). Often referred to as 'dark patterns', these have a negative impact on end users, who face issues including loss of time or resources, addiction to technologies, and difficulties with evaluating, and impulsively sharing, misinformation (Brignull, Dark Patterns). Together, these approaches enable firms to use detailed information about people and their surrounding contexts to place them in manipulative, exploitative systems. Overall, this creates issues with people's wellbeing and personal agency, and with the trustworthiness and stability of social and political systems (Christl).

1.1. Behavioural Psychology

The use of behavioural psychology is also popular outside of a technology context, as a way to create more effective public services and policies. The 'Behavioural Insights Team' set up by the UK Government, for example, brought about a number of governmental programs and policies. These include automatic enrolment for employee pension schemes, a policy where the UK government modified the choice architecture of pension scheme options and 'switched the default from one in which employees had to actively choose to sign up for a pension scheme ("opt in") to one in which they are automatically enrolled onto workplace pension schemes but can choose to opt out if they so desire ("opt out")' (Behavioural Insights Team). The team used behavioural research indicating that opt-out defaults bring about a dramatic increase in enrolment for pension schemes, resulting in a policy that increased saving rates from 61% to 83% (Sunstein, pg. 9). In the USA, the White House Social and Behavioral Sciences Team has worked on projects that help "more students to go to college, more veterans to take advantage of education and job-training benefits, more farmers to obtain loans," and more (Sunstein, pg. 8).

The outcomes of behaviour manipulation, in these cases, can be considered positive and beneficial. Increased cessation of smoking, for example, can have positive effects on public health at both individual and societal scales. Behavioural psychology is also commonly used in marketing contexts. For example, in 'Predictably Irrational', Ariely (pg. 49) highlights marketing techniques that use the concept of 'free' items to create irrational behaviour in consumers.

1.1.1. Behavioural Psychology in Interaction Design

In interaction design, behavioural psychology has been used to create more effective and engaging experiences for technology users. Books such as 'Hooked' (Eyal) and 'Seductive Interaction Design' (Anderson) offer designers advice on creating software and other interactive experiences that use cognitive biases and models to create engagement. In Hooked, Eyal describes habit formation as an effective strategy for digital products: "Instead of relying on expensive marketing, habit-forming companies link their service to the users' daily routines and emotions. A habit is at work when users feel a tad bored and instantly open Twitter. They feel a pang of loneliness and before rational thought occurs, they are scrolling through their Facebook feeds."

Eyal also describes directly usable models for creating behaviour-forming products, referring to theories from persuasion researcher Dr BJ Fogg that set out three requirements for creating behaviours: "(1) the user must have sufficient motivation; (2) the user must have the ability to complete the desired action; and (3) a trigger must be present to activate the behavior." (Eyal) These models allow designers to create products that satisfy these requirements - for example, by creating 'triggers' in their applications that activate desired behaviours.
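To make these three requirements concrete, the sketch below expresses the model as a simple predicate. This is an illustrative reading of the Fogg model rather than code from Fogg or Eyal; the `UserState` structure, the 0-1 scales, and the threshold are assumptions made for the example.

```python
from dataclasses import dataclass

@dataclass
class UserState:
    motivation: float  # 0.0-1.0, how much the user wants the outcome
    ability: float     # 0.0-1.0, how easy the action is for this user
    trigger: bool      # whether a prompt (notification, badge) is present

def behaviour_occurs(state: UserState, threshold: float = 0.5) -> bool:
    """Fogg-style check: the behaviour fires only when motivation and
    ability are both sufficient AND a trigger is present."""
    return state.trigger and state.motivation >= threshold and state.ability >= threshold

# A notification (trigger) reaching a mildly bored user with an easy action available:
print(behaviour_occurs(UserState(motivation=0.7, ability=0.9, trigger=True)))  # True
```

Read this way, a designer's levers are exactly the three parameters: raise motivation, lower the effort of the action, or inject a trigger at the right moment.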
Behavioural psychology is, in many cases, used by interactive products to create beneficial outcomes for users. Gewirtz describes using the fitness tracking features on their Apple Watch to stay fit more effectively. The wearable device offers a set of digital 'rings' that are filled up when the wearer performs physical activities, a mechanism that encourages the user to exercise to fill up their daily rings. The author reports a significant increase in the consistency of their exercise habits, supported by the persuasive features of their tracking application (Gewirtz).

These persuasive design techniques are also often used to create negative outcomes for users, in the form of 'dark patterns'. For example, pre-selected default options are often used to persuade users to agree to privacy-intrusive terms and conditions (Pot) and privacy-intrusive settings such as location tracking (Gibbs). In one instance, the social media service LinkedIn collected email addresses from users' personal email accounts, and persuaded users into sending sign-up invitations to these addresses. This resulted in many email users receiving spam emails under the guise of personal invitations, a practice that led to LinkedIn being sued and paying $13 million in a court settlement (Nicks).

1.2. Surveillance Capitalism

In addition to - and often supported by - persuasive dark patterns, firms often collect and analyze personal and behavioural data from users for their economic benefit. This practice, referred to as Surveillance Capitalism, allows firms to create profiles of their users, enabling them to sell aggregate personal information and create further, personalized manipulative interactions (Zuboff). Data is collected from various sources, including web and mobile usage behaviour, sensors on personal and shared devices such as mobile phones and Internet of Things devices, and data from financial service providers and other services (Christl).

A concerning outcome of extensive data collection is the possibility of influencing public opinion at scale. For example, the 2016 election campaign for former US president Donald Trump made extensive use of Facebook advertising, which uses personal profiles of Facebook users to hyper-target advertisements based on their interests, life events and online behaviours. The scale of this approach is illustrated by the winning Trump campaign running 5.9 million ads on Facebook, compared with competitor Hillary Clinton running only 66,000 ads (Madrigal and Bogost). Hyper-targeting also allows online services to offer users personalized content, a concept often combined with persuasive interaction design and choice architecture patterns such as 'autoplay'.
This can lead users towards increasingly radical content, as service algorithms learn from users' interests and persuade them to consume ever more engaging content about those interests. This form of data-supported persuasion enables radical content about topics such as anti-vaccination (Wong), politics and white supremacy to grow popular among targeted audiences (Friedersdorf). This process can result in incidents such as the mosque shootings in Christchurch, New Zealand, carried out by a white supremacist radicalized by YouTube content (Lopatto).

Further, a risk posed by extensive data collection and surveillance is a 'social cooling' effect, where people's conscious or subconscious awareness of being 'watched' prevents them from fully expressing themselves or taking risks (Schep). This can be particularly problematic in scenarios such as exploitative work or political environments, where people can be discouraged from speaking out against malpractices and organizing against authorities.

1.3. Ubiquitous Computing

Consumer technologies increasingly use contextual information that is collected without any conscious, direct effort on the part of the user. They may also display information to and interact with users through context-aware mediums that can appear 'invisible' to users. This invisible and pervasive form of computation was originally conceptualized as 'Ubiquitous Computing' by Mark Weiser in 'The Computer for the 21st Century', as a contrast to 'desktop computing', where computing takes place at a certain location and device and occupies the full attention of the user. An initial motivation for Ubiquitous Computing systems was to allow for 'calm technologies' that use subtle mediums of interaction to create a simplified and non-distracting experience for users (Weiser).

Ubiquitous Computing, or 'UbiComp', is found, for example, in automated vehicles that use spatial awareness and require limited attention from passengers - which are steadily gaining approval for public and commercial use (Hawkins). It is also part of developments in automated, cashier-less retail stores, where sensors and camera systems allow customers to pick up items and leave the store without checking out (Statt). 'Smart home' technologies are another implementation of these concepts, as in smart thermostats that use human presence and user preferences to control the temperature of a home.

However, a push towards behaviour manipulation and surveillance capitalism has led to mobile phone experiences that are far from 'calm'. Contextual information that is collected without user action (such as location, or user activity at certain times) can also be used without explicit consent, and is commonly used to increase user engagement and cause addictive behaviours, and to promote and advertise in ways that distract and occupy a user's attention (Zuboff). This use of Ubiquitous Computing platforms to manipulate a user's behaviour and attention is increasingly found outside the context of mobile phones, in technology-enhanced 'smart environments'. For example, facial recognition is used to identify and profile customers at certain retail stores and show advertisements targeted to those customers (Gillespie). In the 2019 protests in Hong Kong, passenger data collected through smart transit cards was used by authorities to track protestor locations, creating a deterrent effect on protesters (SCMP Reporters).
1.4. Concerns about technology ethics

Concerns about ethics in technologies are regularly raised in media and politics (Moss and Metcalf). Generally, technologies are regulated by governments using policy and laws, including Canada's Personal Information Protection and Electronic Documents Act (PIPEDA) and the General Data Protection Regulation (GDPR) in the European Union. The use of persuasive technologies in the context of commerce is regulated in the EU through the Consumer Rights Directive, which specifically bans the use of certain persuasive interaction design patterns (Brignull, Some Dark Patterns now illegal).

These initiatives are commonly developed in response to public concerns. In the space of smart environments, public and political pushback led to the curtailing of the Sidewalk Labs smart-city project in Toronto, which was eventually cancelled after facing governmental approval issues (Carter et al.). Increasingly frequent bans on police usage of facial-recognition technologies in various cities are another indication of public and political disapproval of ubiquitous computing systems on ethical grounds (Moon).

Social discussion and political debate and action are supported by scholarly work in this space. This is seen in the form of informative discussion and analysis of technologies, such as reports on privacy practices at various firms (Forbrukerrådet), informative collections of manipulative 'dark' design patterns (Brignull, Dark Patterns), and speculative and provocative design work (Nicenboim). The EU Consumer Rights Directive, for instance, references design patterns initially discussed by Brignull in their darkpatterns.org project.

1.5. Research Questions

Ubiquitous computing systems provide unique sources of data and mediums of interaction that offer opportunities for unethical interaction design - bringing up research questions about what these opportunities are. How can technologists leverage these opportunities to implement design patterns that harm or negatively impact people? Further, how can existing unethical design patterns be adapted to effectively use these opportunities? These questions depend on further questions of how ethics is defined. What makes a technology unethical? What are the roles of the individuals and groups affected by a technology in defining its ethics?

1.6. Rationale

Academic research around technology ethics has previously provided foundational material for further research, for conversations in public, and for building policy that regulates technologies. The growing viability of ambient technologies, and growing concerns around privacy and user agency, provide the rationale for continued research in this space. The processes of understanding, using, and curtailing the usage of design patterns are made simpler when the patterns are reconsidered and applied to new and different contexts. Research into the interaction patterns used by unethical technologies in new contexts is helpful not only for understanding their application in those contexts, but for understanding the patterns themselves. This presents a rationale for further research that adds to a growing body of knowledge in this space.

1.7. Research Scope

The variety of audiences in this space creates challenges for research and design work. Audiences include the general public, technologists in industry, and policymakers and related groups. These audiences hold differing goals, respond to different stimuli and incentives, and place varying levels of interest and value on these issues.
The scope of this research is limited to gaining an understanding of, and identifying, potential design opportunities that can effectively help resolve these issues across these audience groups. The scope is also limited to the practice of interaction and interface design within the context of smart environments. Business practices, theories in behavioural economics, ethics in non-ambient technologies and other related areas are researched and discussed according to their relevance to this scope.

In the space of Ubiquitous Computing and technology ethics, a few common methods to explore possibilities and consequences include creating speculative and provocative prototypes, collaboratively working on conceptual design in the form of workshops, and collecting and disseminating technologies and interaction design case studies to understand their ethics. A combination of these methods is used here to speculate on, collaborate around, and disseminate possibilities for unethical smart environments.

2. Common Methods in Ubiquitous Computing and Ethics Research

Researchers in Ubiquitous Computing may attempt to envision and create prototypes that demonstrate the experience of using a ubiquitous system. These 'Experience Prototypes' may or may not be technically functional, but are intended to simulate the experience of using a functional system (Buchenau and Suri). Interactive prototypes are also used in research and design work meant to raise questions about technology ethics. Cennydd Bowles refers to 'provocatypes', or designed prototype artifacts that are meant to provoke critical conversations about the societal consequences of technologies (Bowles).

A number of such provocative prototypes were created by Iohanna Nicenboim to explore the possibilities for digitally enhanced household appliances to collect data and conduct 'research' on the user. The project explores "the social and ethical implications of autonomous experimentation", referring to techniques used by digital products to identify profitable opportunities within a user's behaviour and activities. One example is a toaster that conducts A/B tests - a technique that involves showing different users different variants of content or design patterns to identify the most effective variant. Nicenboim's methods involved identifying issues with current digital business practices being implemented in physical objects, and creating speculative prototypes that demonstrate these issues. This method is effective in highlighting potential issues with near-future technologies, as technologists carry current business practices into the foreseeable future.
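As a rough sketch of what such 'autonomous experimentation' looks like in code, the following is a generic A/B-testing skeleton of the kind a connected product might run on its users. It is not code from Nicenboim's project; the variant names and success metric are invented for illustration.

```python
import hashlib
from collections import defaultdict

VARIANTS = ["toast_setting_A", "toast_setting_B"]
results = defaultdict(lambda: {"trials": 0, "successes": 0})

def assign_variant(user_id: str) -> str:
    # Deterministic assignment: the same user always sees the same variant.
    bucket = int(hashlib.sha256(user_id.encode()).hexdigest(), 16) % len(VARIANTS)
    return VARIANTS[bucket]

def record_outcome(user_id: str, engaged: bool) -> None:
    variant = assign_variant(user_id)
    results[variant]["trials"] += 1
    results[variant]["successes"] += int(engaged)

def best_variant() -> str:
    # The 'winning' variant is simply the one with the highest observed rate.
    return max(results, key=lambda v: results[v]["successes"] / max(results[v]["trials"], 1))

record_outcome("user-1", engaged=True)
record_outcome("user-2", engaged=False)
print(best_variant())
```

The ethical question the toaster raises is visible in the code: nothing in this loop requires the user to know they are a subject in an experiment.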
2.1. Design Fiction

Similar ideas are explored in the 'design fiction' methodology conceptualized by Bleecker et al. In one project, 'QUICK START GUIDE', Bleecker et al., as part of the Near Future Laboratory, created a speculative guidebook for self-driving cars, raising questions about their potential impact on the lifestyles of their users. This project included a workshop where participants collectively and collaboratively worked on creating speculative scenarios. A workshop element allows participants to learn and have conversations about speculative futures through the process of making, more so than they would by simply experiencing the prototype (Near Future Laboratory). This project shows the viability of speculative design workshops as a method to create awareness and understanding of possible futures and the ethical implications of technologies.

2.2. Card Supported Activities

This collaborative workshop element is further explored in participatory workshop projects supported by prompts or game-like structures. In 'Reimagining the Now', Russell, Hertz and Badke (Russell et al.) employ a series of cards to prompt workshop participants to generate speculative scenarios. Participants work with sets of cards containing the 'elements' of a scenario (such as technological, political, and social contexts) and are prompted to collaboratively create scenarios that reimagine current technological contexts with differing cultural values.

2.3. Collections of Examples

Another approach to researching technology ethics - including within the medium of UbiComp - is the practice of collecting and coding examples of privacy-intrusive or manipulative technologies. The 'Dark Patterns' website (Brignull, Dark Patterns) contains a number of examples of web-based dark patterns. The website popularized the term 'dark patterns', and helped bring awareness to the possibilities for websites and apps to use cognitive biases to create more desirable or profitable behaviours in users (Simonite). Brignull has been referenced or interviewed by publications including the New York Times (Singer), and the #darkpatterns hashtag on Twitter is frequently used to showcase examples of dark patterns that users find.

In 2014, the EU Consumer Rights Directive came into force, regulating digital e-commerce products. The regulations included several bans on specific dark patterns related to e-commerce, demonstrating the viability of this method of collecting and categorizing examples of manipulative design (Brignull, Some Dark Patterns now illegal).

The Dark Patterns website is structured around collecting examples of interfaces that use similar techniques, or 'patterns', which are given names such as 'Sneak into Basket' or 'Hidden Costs'. Each pattern is described, followed by a study of examples of the pattern as implemented in existing websites or apps. Numerous images highlight where the pattern appears and how it looks, helping readers visually identify these patterns.

Dark patterns were explored in the context of Ubiquitous Computing by Greenberg et al., specifically in the domain of 'proxemic interactions'. Computing that uses the opportunities offered by proxemic awareness (sensing the environments, people and objects within a space) allows for unique dark patterns. Their paper, Dark Patterns in Proxemic Interactions, follows a similar structure of defining patterns and explaining their implementations using examples (Greenberg et al.). Due to the lack of commercial, publicly available projects that demonstrate certain dark patterns, the paper features numerous speculative or conceptual projects.

These projects and methods are some of the possible approaches for research in the spaces of ubiquitous computing and technology ethics. An amalgamation of these methodologies and methods is used in this thesis.

3. Methods

This thesis primarily features a methodology of provocation-led workshops. A number of 'provocatypes', or provocative prototypes, are created to explore and demonstrate technologies with disputable ethics. Participants at a number of workshops interact with these prototypes, gaining an embodied, experiential understanding of their functionality and interactivity. Participants are prompted, with the help of a card-based activity, to envision and brainstorm speculative ethical/unethical smart environment scenarios.
These concepts are collected and, together with other speculative and real-world scenarios, published on an online collective resource intended to build awareness of and highlight manipulative and intrusive possibilities in smart environment design. The research was supported by numerous conversations with researchers and other individuals involved with technology ethics.

3.1. Prototyping an unethical Voice Assistant TOS document

An initial speculative prototype created for this thesis was a smart voice assistant demo, in which users experienced a long Terms of Service (TOS) document being read out. Voice interfaces and assistants (such as Amazon's Alexa) were an initial focus for this thesis. Voice interfaces present many unique capabilities and limitations compared to screen-based interfaces, which can be used to create unique dark patterns and collect new kinds of data. For example, they cannot display a large amount of information at once for a user to take in at their own pace. This limitation allows voice interfaces to withhold information that might be relevant or important to the user. For example, while a user is purchasing items online, a voice assistant may present them with items that create larger profits for businesses, and users are not able to see a balanced view of other options.

To explore this concept, I created a voice-based prototype where users were prompted to use a weather app, and were presented with a Terms of Service document. The reading of the document took about 30 minutes, which would be uncomfortable and impractical to listen to.

Figure 1 - Test participant experiencing a Terms of Service document over voice

The prototype was demonstrated to a number of designers, who expressed a sense of unease and were neither able nor interested in listening to the entirety of the TOS document (Figure 1). The audience was also unable to focus on the contents of the document, and agreed to the terms without listening to them fully. This provocatype highlighted the impractical, manipulative and unethical nature of long-form TOS documents by exploring how they would function in a different medium. It also highlighted issues of consent in a non-visual computing environment - informed consent can be difficult or impossible to obtain, as Terms of Service documents and privacy policies are presented in mediums that do not allow large amounts of information to be presented in a readily accessible format.
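The prototype's effect can be approximated with an off-the-shelf text-to-speech library. The sketch below, which assumes the pyttsx3 package and a local terms.txt file containing a TOS document, is a reconstruction of the idea rather than the original prototype's code.

```python
import pyttsx3

# Read an entire Terms of Service document aloud before allowing the user
# to proceed. At speaking pace, a typical TOS takes roughly half an hour.
engine = pyttsx3.init()
engine.setProperty("rate", 160)  # words per minute, roughly conversational

with open("terms.txt") as f:
    tos_text = f.read()

engine.say("Before checking the weather, please listen to our terms of service.")
engine.say(tos_text)
engine.runAndWait()  # blocks until the full document has been spoken

answer = input("Do you agree to the terms? (yes/no) ")
```

The blocking `runAndWait()` call is the point of the provocation: the medium forces the document to occupy real time and attention, making the impracticality of 'informed' consent audible.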
3.2. Conversations with Researchers and Experts

The creation of the voice TOS prototype spurred interest in unethical interaction design on other screenless devices and in other scenarios, where common manipulative or intrusive patterns found on screen-based devices today could emerge in a modified form. This led to a few key questions about non-traditional interfaces, defining technology ethics, and possible solutions to ethics issues. To help answer these, I spoke to researchers and privacy/ethics experts in 2020, including:

• Jason Woywada, Executive Director at BC Freedom of Information and Privacy Association (Woywada, Interview)
• Iohanna Nicenboim, UX Researcher (Nicenboim, Interview)
• Harry Brignull, expert witness in dark patterns and creator of darkpatterns.org (Brignull, Interview)
• Saul Greenberg, author of Dark Patterns in Proxemic Interactions (Greenberg, Interview)

Conversations with these individuals occurred over video and phone calls, and helped answer the following questions.

Outside of voice interfaces, what non-traditional interfaces could be a medium for unethical design?

A starting point was to look at Nicenboim's provocative Internet of Things projects, mentioned earlier. These focused on IoT devices used at home, such as toasters or weight scales. A conversation with Brignull brought up Greenberg's 'Dark Patterns in Proxemic Interactions', which focused on the medium of interaction (proximity between people and devices/spaces) rather than specific devices. Further, a conversation with Greenberg brought up Ubiquitous Computing, scenarios of which could include any interfaces that are pervasive in environments and that use sensors and passive mediums (such as vibration) to interact with users. Of many possible directions, Greenberg agreed that the automated retail space (such as Amazon's cashier-less stores) would be an ideal subject to research, as the commercial nature of the space provides many opportunities for monetization through unethical design.

How can ethics and unethical design be defined and discussed in the context of a thesis?

The subjective nature of ethics makes it difficult to define or determine whether an interface is 'ethical' or 'unethical'. A common theme that emerged when speaking to these experts was manipulation, or interfaces that manipulate users. Woywada highlighted the concept of 'identification and persuasion', a pattern used in marketing to persuade audiences to behave in certain ways by using specific personal information about them. Similarly, Nicenboim's projects highlighted 'experiments' that firms, software or devices run on their users, the findings of which are later used to monetize or create further engagement with these products. Brignull and Greenberg both agreed that labelling interfaces as 'unethical' is difficult and open to debate. Many of the 'dark patterns' highlighted by Brignull cause only momentary frustration, for example, while others cause significant financial harm. Using the term 'dark patterns' was preferable, as it simply refers to interfaces that can be unpleasant or harmful to users.

What actions or directions can be taken to help solve or prevent issues with technology ethics?

Nicenboim's methods involve creating fictional 'worlds', and provocative physical objects that are meant to exist in those worlds, for display and discussion by an audience. This 'design fiction' approach is helpful for exploring ethics in technologies that are not widespread or that do not yet exist. However, a point brought up in conversation was the missing 'time dimension': the effects of manipulative technologies are often felt only after users interact with them for weeks, for example. This time-based effect cannot be reproduced in an object witnessed by an audience for a few moments.

Woywada and Brignull both agreed on government regulation as the most effective path to resolving these issues. Woywada's organization, the BC Freedom of Information and Privacy Association, is one of various organizations that work towards building regulation around technology ethics. Regulatory acts such as the Personal Information Protection and Electronic Documents Act (PIPEDA) in Canada, and the GDPR in the EU, provide legal frameworks and opportunities for groups or individuals to bring legal action against firms that benefit from manipulative or intrusive technologies. Compared to other methods of resolving these issues, regulation stands out because its effects and outcomes can be clearly identified.
These conversations helped make key decisions about the scope, direction and intended outcomes of this thesis:

• Automated retail was selected as an initial direction to focus the research on, as it provides various opportunities for manipulation and monetization.
• Going forward, this thesis specifically discusses privacy-invasive and behaviourally manipulative interfaces, so as to avoid attempting to define ethics. However, 'ethics' and 'unethical technologies' are terms used in certain contexts, such as during discussions with workshop participants, where the subjectivity of these topics is useful in bringing up diverse ideas.
• Government regulation was identified as an area to be supported, due to its identifiable benefit. The design and participatory work taken up during the thesis should ultimately support the building of regulation by organizations and governments in the future.

3.3. Manipulative interfaces in a retail environment

To explore manipulative interfaces outside of voice assistants, I explored ideas for smart environment technologies that were more physical. Ubiquitous Computing systems can involve the user's movement, position, and body language, and a provocative prototype that reflects these capabilities would be an effective alternative to a voice assistant app. One area with potential for physical, tangible UbiComp systems is the retail space. It allows for regular, semi-public use by a range of audiences, with many opportunities for gathering data about these audiences and for creating personalized, monetized experiences.

As of the writing of this thesis, there are over 25 Amazon-owned automated retail stores across the USA. These stores operate without cashiers or self-checkout stands; customers are able to pick up items, place them in a bag of their choosing and leave the store. A phone scanner at the entrance identifies customers as they enter the store, and links them to their Amazon accounts. Cameras placed throughout the store track their position and what they pick up, and the system charges the customer's Amazon account as they leave (Coldewey). This technology is available for license by other retailers, and similar systems are being developed by other parties (Vanian).

The identification of users and the tracking of their position around the store allow for potential dark patterns in this space. On the internet, retailers and other commerce-related businesses commonly use dark patterns to sell more products more effectively. These include personalizing recommendations based on advertising profiles, or increasing sales using cognitive biases such as time scarcity, availability scarcity or aspirational purchasing (Brignull). These dark patterns can be applied and enhanced in a smart-environment setting. Greenberg et al. mention a number of potential dark patterns that use the proxemic interactivity between users and Ubiquitous Computing systems. With the 'Captive Audience' pattern, systems make use of opportunities presented by a person inadvertently spending a certain amount of time in a particular area. Another pattern is the 'Attention Grabber', where systems attract the attention of passersby who have no intention of interacting with the system (Greenberg et al.). These patterns can be used to create an engaging and persuasive experience for customers at a retail store.
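Reduced to its essentials, the checkout-free flow described above is a session that opens when the entrance scanner links a shopper to an account, accumulates pick-up and put-back events from the camera system, and charges the account on exit. The sketch below is a hypothetical simplification of that flow; the class and method names are invented, and a real deployment's sensing and billing integrations would be far more involved.

```python
from dataclasses import dataclass, field

@dataclass
class StoreSession:
    account_id: str                       # linked on entry via phone scan
    basket: dict = field(default_factory=dict)

    def record_pickup(self, item: str, price: float) -> None:
        # In a real store, camera and shelf sensors would emit these events.
        self.basket[item] = price

    def record_putback(self, item: str) -> None:
        self.basket.pop(item, None)

    def close(self) -> float:
        # Triggered when the customer walks out: charge the linked account.
        total = sum(self.basket.values())
        print(f"Charging account {self.account_id}: ${total:.2f}")
        return total

session = StoreSession(account_id="amazon:12345")
session.record_pickup("bread", 3.50)
session.record_pickup("eggs", 4.25)
session.close()
```

Every event in this session - identity, position, items handled - is also a data point about the shopper, which is precisely what makes the space fertile ground for the patterns discussed next.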
3.3.1. Provocative prototypes for dark patterns in a retail store

I created provocative prototypes for three dark patterns in retail spaces using smart price displays. The 'Smart Price Display' prototypes were created using a small screen and a proximity sensor, and displayed prices for items, similar to the price displays on shelves at a retail store. The prototypes use proximity to simulate the identification of a customer in the vicinity, and change the display according to that customer's identity and behaviours. These prototypes were intended to speculate on, and provoke conversation about, possibilities for intrusive/manipulative design patterns - including personalized advertising and persuasive messaging - in a retail space.

Figure 2 - Smart Price Display Prototype 1 with recommended items based on what customers pick up

The first prototype (Figure 2) recommends a related product as a customer is picking up an item. When a customer picks up an item (for example, bread), the display recommends a related item (such as eggs), using encouraging messages such as 'Goes well with'. Regularly changing the layout of a store is a common method of manipulating retail customer behaviour (Rupp). Rearranging a store layout can cause confusion and cause customers to notice and purchase items they normally would not notice. Using smart displays, a similar effect could be achieved without physically rearranging the store. Rather, the customer is encouraged to take new routes around the store to look for the recommended product. In the process, they pass by, and pick up, other products they normally would not notice. The display system can intelligently recommend products based on the layout of the store and the customer's habitual paths around it, creating new and effective paths for them to follow.

Figure 3 - Smart Price Display with recommendations based on what the customer already has

The second prototype (Figure 3) recommends similar products based on what the customer has already picked up, or is planning to pick up. This encourages the customer to move continuously and intentionally through the store, without moments to pause and reconsider their browsing. This concept, known as the 'bottomless bowl', is found in content-based digital products including Netflix, Instagram, and YouTube. It involves continuously giving users something to watch or interact with, without breaks for making decisions to stop, and can create an addiction to the product. The algorithmic selection of videos can lead users to unwanted, harmful and provocative content, such as extremist or pedophilic content, an issue identified in particular with YouTube (Chakrabarti). In a physical retail setting, this concept can increase the time spent at a store, and create purchasing addictions, as the customer is always given something to look for. Algorithmic selection can also lead to multiple unintended and unwanted purchases - for example, one unhealthy product picked up can lead to a chained series of unhealthy products being recommended and picked up.

Figure 4 - Smart Price Display with personalized recommendations

The third prototype (Figure 4) recommends products as customers pass by, relying on the 'Attention Grabber' pattern to persuade them to stop and look at the recommendation message. The display shows a highlighted banner with text saying that the item is specifically recommended for the current user. Recommended options and manipulative choice architecture are common in many digital services. Influence over a customer's decision-making through choice architecture can limit people's agency and opportunities (Sunstein). Like the previous prototype, the personal recommendation prototype can cause purchases that are appealing but unsuitable for people, and can have a negative impact on people's health and wellbeing.
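A minimal version of the display logic behind the first and third prototypes might look like the sketch below, where a proximity reading stands in for customer identification and a hand-written lookup table stands in for a recommendation engine. The function and table are assumptions made for illustration, not the prototypes' actual code.

```python
# Hypothetical 'Smart Price Display' logic: when a customer is close
# enough, show a personalized nudge instead of the plain price.
GOES_WELL_WITH = {"bread": "eggs", "pasta": "tomato sauce", "chips": "salsa"}

def render_display(item: str, price: float, distance_cm: float,
                   picked_up: bool) -> str:
    if distance_cm > 80:
        return f"{item}: ${price:.2f}"          # nobody nearby: plain price
    if picked_up and item in GOES_WELL_WITH:
        # 'Goes well with' nudge, shown the moment the item is lifted.
        return f"{item}: ${price:.2f} - goes well with {GOES_WELL_WITH[item]}!"
    # 'Attention Grabber': address passersby who showed no intent to interact.
    return f"{item}: ${price:.2f} - recommended just for you"

print(render_display("bread", 3.50, distance_cm=40, picked_up=True))
```

The manipulative element sits in a single conditional: the display's message changes based on sensed behaviour the customer never explicitly shared.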
3.3.2. Workshop 1 - Provoking to stimulate unethical concepts

Beyond exploring ethics in the context of smart environments, an intention of this research was to explore possibilities for introducing the ideas of technology ethics and ubiquitous computing design to interaction designers. The three provocative prototypes were featured as part of an idea generation workshop. This was intended to help discover how designers could be introduced to the possibilities and potential negative effects of ubiquitous computing scenarios. Further, through the workshop, participants would generate ideas and contribute to a collection of examples of manipulative or intrusive scenarios.

These workshops are intended to help designers become more ethical by making more informed, intuitively ethical decisions around interaction design. By working on unethical scenarios, inspired by provocations, participants can engage with these concepts in a low-risk, engaging experience. As designers are commonly familiar with Design Thinking, user-centred design, and building empathy with users - ideas frequently brought up in design education - their work can often be motivated by positive intentions, and negative possibilities are often overlooked. Through this workshop, participants can view ethics from an alternative perspective: by purposely designing scenarios with negative outcomes, participants can gain insights into how these outcomes and design patterns come to be, and become able to identify these possibilities in their work going forward.

Figure 5 - Workshop with Smart Price Display prototype

Four graduate students participated in the initial workshop (Figure 5). The prototypes were presented through an interactive demo, where participants tried to trigger the different advertisements on the price display. This prompted a discussion among the group about various manipulative interfaces and physical forms of manipulation they had experienced. The group was then prompted to think up similar ideas for manipulative systems in retail stores. The group created ideas such as price displays that highlight foods that suit a customer's dietary needs, or that recommend products known to create addictions for individual customers.

3.4. Hostile Architecture as a space for manipulative interaction design

The initial provocatype demonstrated the possibilities for UbiComp systems in smart environments to influence customer behaviour using visual messaging, with indirect and unintentional proximity-based interactions. Following the demonstration of this provocatype, many of the concepts created by workshop participants made use of displays and visual communication. This overlooks the possibilities for smart environments to feature more subtle and physical interactions with people, which can allow for more unique manipulative interactions. To explore this possible subtlety of UbiComp design, a display-less prototype was created and demonstrated, using the concept of hostile architecture in public spaces. Hostile architecture is public architecture designed to discourage its use in certain ways or by certain individuals.
Common examples include benches or other seating areas shaped in ways that discourage sleeping or loitering, ledges with bolts placed on them to discourage skateboarding, or vents with spikes placed on them to discourage sleeping (Hu). In effect, hostile architecture can result in a comfortable experience for people using it in 'desirable' ways (such as sitting on a bench for a short amount of time) and an uncomfortable experience for people using it in 'undesirable' ways (such as sleeping on a bench overnight). In this case, persuasion is used not to change behaviours around product purchases or customer engagement, but to change behaviour in the use of public architecture and space.

In interaction design, invisibility can hide mechanical processes and underlying computation, and remove the sense that a system is capable of processing and storing data. In the previous Price Display provocatype, the cameras, proximity sensors and motion sensors required to identify customers and movements are invisible and passively triggered. Passively interactive ubiquitous systems have been used for ethically questionable purposes in examples from London (Moody) and the USA (Whittaker), where devices such as Stingrays (which collect data on the locations of individuals in public) are in use, and continue to be used in other places. Besides collecting data on people in public spaces, UbiComp systems can potentially be used to manipulate their behaviour in these spaces.

Using concepts of hostile architecture, a provocatype of a subscription-based public bench was created. With the Subscription Collapsible Bench, a user is required to identify themselves by presenting a smart card from the local transit authority, from which the user has purchased a transit subscription. The availability of the bench can thus promote the sale of transit subscriptions. The bench stays upright and comfortable as long as it detects the presence of a valid smart card. Through the invisible collection of data and persuasive behaviour, the bench can be comfortable for transit subscribers. This avoids many of the surface-level issues with hostile architecture, such as the mild discomfort faced even by 'desired' users of the architecture. As the perceived discomfort is reduced, there exists the possibility for a deeper disconnect between the general public and the at-risk populations that might use public architecture in 'undesirable' ways.
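The bench's behaviour reduces to a single conditional: stay upright while a valid subscriber's card is detected, and collapse otherwise. The following is a schematic reconstruction under that assumption; the card-reader and actuator functions are placeholders for whatever hardware interface an actual build would use.

```python
import time
from typing import Optional

VALID_SUBSCRIBERS = {"transit:9001", "transit:9002"}  # hypothetical card IDs

def read_nearby_card() -> Optional[str]:
    """Placeholder for an NFC/RFID reader polling for a transit smart card."""
    return None  # replaced by real reader output in an actual build

def set_bench_upright(upright: bool) -> None:
    """Placeholder for the actuator folding or unfolding the seat."""
    print("Bench is", "upright and comfortable" if upright else "collapsed")

# Control loop: runs for the life of the device.
while True:
    card = read_nearby_card()
    # Comfortable only while a valid subscriber's card is present.
    set_bench_upright(card in VALID_SUBSCRIBERS)
    time.sleep(1)
```

The provocation is that such a small, invisible rule quietly redefines who a piece of public furniture is for.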
3.4.1. Workshops 2 - Card Activities and Moments of Reflection

Like the previous provocatype, the Bench was demonstrated to, and interacted with by, participants at a series of workshops. These workshops added a structured, card-based activity with prompts and suggestions for participants. Supportive and suggestive cards are used in projects such as Reimagining the Now by Russell et al. to help participants quickly learn about a number of new topics (Russell et al.). One concern from the previous workshop was that the presented provocation (the Price Displays) might limit participants' ideas to the UbiComp systems demonstrated in the provocation.

Figure 6 - Prompt cards used during workshops

A number of cards were designed to help mitigate this issue (Figure 6). These include:

• Context cards, featuring public and semi-public spaces and contexts (such as a cafe, library or hotel)
• Technology cards, featuring potentially ubiquitous technologies (such as gait-identifying cameras)
• Persuasion cards, featuring a number of commonly used persuasion methods (such as limiting someone's available time to make a decision)

A schematic sketch of how a draw from these decks works appears at the end of this subsection. The cards were tested in numerous iterations as a support structure during workshops where participants generated ethically questionable scenarios. The revised version of the workshop was tested in five sessions, three of which involved individuals and two of which used groups of participants. As in the previous workshop, sessions began with a demonstration of the provocation (in this case, the Collapsible Bench), followed by discussions and participatory brainstorming.

Figure 7 - Example of workshop outcomes on a 'game board'. Boards prompted participants to pick a Context card and imagine running that establishment, choose behaviours to encourage or discourage, draw (Tech) Identifier and Trick cards, consider the trick's long-term negative impacts, and finally ask how the scenario could be turned around to be ethical.

Workshops were conducted both online and in person. Online workshops did not allow for any participant interaction with the provocatype. Cards were presented over an online collaborative brainstorming application (Miro) and laid out on 'game boards' featuring prompts and questions, as seen in Figure 7.

Figure 8 - In-person workshops with prototype interaction

In-person sessions (Figure 8) allowed participants to interact with the prototype and form a more embodied understanding of the system.
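The structure of the card activity can be expressed as a draw of one card from each deck. In the sketch below, the deck contents are a small sample of the kinds of prompts described above, not the full printed sets.

```python
import random

CONTEXT_CARDS = ["cafe", "library", "hotel", "transit station"]
TECHNOLOGY_CARDS = ["gait-identifying camera", "smart price display",
                    "pressure-sensing bench", "emotion-detecting kiosk"]
PERSUASION_CARDS = ["limit available decision time", "personalized scarcity message",
                    "social proof from nearby strangers"]

def draw_prompt() -> dict:
    """Draw one card from each deck to seed a speculative scenario."""
    return {
        "context": random.choice(CONTEXT_CARDS),
        "technology": random.choice(TECHNOLOGY_CARDS),
        "persuasion": random.choice(PERSUASION_CARDS),
    }

# Each group receives a random combination to build a scenario around, e.g.
# {'context': 'hotel', 'technology': 'smart price display',
#  'persuasion': 'limit available decision time'}
print(draw_prompt())
```

Randomizing the combination forces participants out of familiar territory, at the cost of sometimes assigning topics a group already knows well, a trade-off revisited in section 4.4.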
3.4.2. Reflecting and Flipping Ethics

Participants in the initial workshops left with a better sense of how Ubiquitous Computing systems, and interaction design in general, could be unethical, and of how designers such as themselves could create these unethical scenarios. A further addition to later workshops was an 'ethics flip', a segment in which participants reflected on why the scenarios they had designed were unethical, and on how the scenarios could be modified to perform similar functions without harming users. For example, one group worked with the context of a hotel, and conceptualized mini-fridges that offered alcohol at discounted prices based on the guests' moods, persuading them to drink more alcohol. In the flip, they designed scenarios where alcohol consumption could be controlled according to a limit initially set by the guest, with messages encouraging the guest to be mindful of their consumption.

3.5. Collecting and Coding Scenarios

Through the workshop process, a number of scenarios of manipulative public-facing technologies were created. Combined with real-world examples, these scenarios could then be classified and patterns identified. Broadly, two themes emerged.

Technologies for 'appropriate' public behaviour

These technologies are meant to prompt people to behave in ways that firms or governments consider 'appropriate'. The 'Smart Hostile Bench' is an example of this type of technology. The bench is, to many people, as convenient as a regular bench. However, the bench does not function for people trying to use it 'inappropriately' - people are not able to sleep on it, for example. A similar example is the jaywalking street signs found in China (Lin), where cameras equipped with facial recognition identify jaywalkers and publicly shame them on display boards.

Beyond actively controlling people's actions or shaming them, technologies might also subtly influence behaviour simply by surveilling people. Surveillance on its own can have a 'social cooling' effect, where people's conscious or subconscious awareness of being 'watched' prevents them from fully expressing themselves or taking risks (Schep). As instances of surveillance in public spaces increase - through facial-detection equipped cameras, license plate scanners, biometric identifiers and other identifiers - the risk of social cooling increases accordingly.

Technologies to create engagement and addiction

These technologies are common on web and mobile platforms today. Many scenarios found on Brignull's darkpatterns.org are designed to increase the time, attention and/or money a user spends on a website or app. In the context of smart environments, an example of engagement-inducing technology is the use of facial recognition to promote or advertise. In a KFC restaurant in Beijing, customers see menu suggestions based on their estimated age, gender and mood as identified by face-scanning systems (Lin). An example created during a workshop encouraged retail customers to use a store's loyalty points system, and subsequently purchase more items. Displays in the store would show customers how many points they had left to collect towards a reward, and prompt them to buy a certain product to earn those points.

The ethics of many of the scenarios found or created are subjective - in many cases, for example, the harm or inconvenience they cause to users could be considered negligible. Nevertheless, these technologies can generally be classified as 'manipulative' or 'intrusive' in nature, as they operate by identifying personal information or by modifying the behaviours of the people who encounter them.
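For collecting and coding, each scenario can be stored as a small record tagged with the two themes above. The schema below is an illustrative assumption about how entries could be organized, not the actual data model used by the website described next.

```python
from dataclasses import dataclass, field
from typing import List

THEMES = {"appropriate_public_behaviour", "engagement_and_addiction"}

@dataclass
class Scenario:
    title: str
    description: str
    source: str                       # 'workshop' or 'real-world'
    themes: List[str] = field(default_factory=list)

    def __post_init__(self):
        unknown = set(self.themes) - THEMES
        if unknown:
            raise ValueError(f"Unrecognized themes: {unknown}")

catalogue = [
    Scenario("Smart Hostile Bench",
             "Public bench that stays comfortable only for transit subscribers.",
             source="workshop", themes=["appropriate_public_behaviour"]),
    Scenario("Loyalty-point prompts",
             "In-store displays nudging customers to buy items to earn points.",
             source="workshop", themes=["engagement_and_addiction"]),
]
print([s.title for s in catalogue if "engagement_and_addiction" in s.themes])
```

Coding scenarios into a consistent structure like this is what allows patterns to be compared across workshop and real-world examples.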
3.5.1 Website
Figure 9 - Website for Persuasive Patterns in Smart Cities, with highlighted sections (bottom right and left)
A website was created to display these scenarios and their themes and categories (Figure 9). Entitled ‘Persuasive Patterns in Smart Cities’, the website features information and visual examples of scenarios and patterns. These are categorized as above: as technologies for public appropriateness, and as technologies for engagement and addiction.
Figure 10 - Sample website illustration depicting a dark pattern in a smart environment
Patterns are depicted using illustrations detailing a scenario. One example is ‘Running Low Trackers’, a pattern where a retail customer’s inventory of a product is tracked and used to advertise to them (Figure 10).
Figure 11 - Add Your Own section
An important feature of the website is an ‘Add Your Own’ section (Figure 11), where visitors can upload their own scenarios and patterns. This supports the cyclical nature of this research project: the website provides provocations, and as people experience in-person provocations and take up workshops, they add the scenarios they generate to the website. Over time, this allows the website to grow into a comprehensive reference for persuasive patterns and ethical interaction design in smart environments. Further possibilities include features that allow for community and conversation on the website, enabling it to become a central point for discussion about design patterns in smart city technologies.
4. Analysis and Reflection
This thesis featured three primary methods: the creation of provocative prototypes, workshops where speculative scenarios were discussed and ideated, and a collection of examples of these scenarios. These elements came together to create a structure of mutual support: provocations helped run workshops, which in turn provided scenarios for the collection. This gives each method a layer of intentionality and additional value. Beyond researching privacy and manipulation in smart environments, this layered approach can be applied to research or design projects where participants are unfamiliar with the possibilities or technical capabilities of the medium they are designing for.
4.1. Effective workshop support through provocation
Provocations or prototype demos at the beginning of workshops helped participants understand the possibilities of pervasive technologies in a public smart environment. Through interacting with the prototypes, participants learned how these systems can identify individuals, collect and analyze data from sensors such as cameras and proximity sensors, and trigger interactions with targeted passersby. This awareness of possibilities is vital to thinking about interaction design within Ubiquitous Computing, and about the potentially harmful consequences these technologies can lead to. Participant groups often had to be prompted to think of ideas for ‘unethical’ interaction design by assuming they were the proprietors of a retail store, seeking to create profit with malicious intent. This helped participants create ideas that were more directly harmful to the customer, including offers for products based on the customer’s insecurities or recent negative life events. The ‘invisible’, screen-less provocation and prompt cards had an influence on participants’ ideas for intrusive or manipulative interfaces.
For example, in the second phase of workshops, one concept featured vibrating wristbands in a fitness context as a screenless, minimally interactive persuader. Participants also applied persuasive techniques from the prompt cards in different contexts, as in the case of bookstores that advertise books related to each customer’s personal goals.
4.2. ’Ethics Flips’ at workshops helped identify potential guidelines or patterns for ethical technologies.
In the ‘ethics flip’ segments of later workshops, participants included certain elements in their scenarios that helped push the scenarios towards behaving more ethically. For example, in the scenario of hotel mini-fridges with user-defined limits and mindful messages, the elements of user consent, user agency, and a push towards mindful usage of technology helped create a scenario that benefits and supports the user. These elements were also part of ethics-flipped scenarios created in other workshop sessions. This indicates the potential for creating a set of generally applicable elements or patterns that encourage or ensure more ethical and beneficial work in interaction design.
4.3. Demonstrating a provocative prototype when running workshops influenced the scenarios the participants created.
The first workshop featured a provocative demonstration of a screen-based prototype (smart price displays). Most of the scenarios participants later ideated featured screens and visual manipulation. In the second workshop, participants witnessed an interaction with a provocative prototype that did not rely on a screen, but worked invisibly, using the physical environment. Participants ideated scenarios that were physical and ‘invisible’, using tactility (smartwatch vibrations) and environmental effects (such as increased temperatures) to manipulate user behaviours. This indicates that demonstrating or interacting with prototypes has a significant influence on the thinking and work produced by workshop participants. The provocation-led workshop method thus adds value both to creating provocations and to running workshops. It also offers a set of constraints for designing provocations: primarily, that they must be quickly understandable, and that they must demonstrate possibilities for similar prototypes.
4.4. Modifying the workshop format for audience experience levels
Workshop participants had varying levels of familiarity with interaction design, technology ethics, and UbiComp, ranging from being familiar with these topics to having to be introduced to them. Familiarity was also distributed within participant groups, as members could be collectively familiar with a topic between themselves. The workshops were designed to introduce UbiComp and ethics within that space, and were not designed differently for different audiences. Participants already familiar with dark patterns expected more from the workshops than discovering how to use them in a smart-environments setting. Further, these participants would have preferred not to work with randomized cards when picking a technology or persuasive pattern: since they were already familiar with some technologies or patterns, that familiarity could have been used to create more interesting scenarios, rather than spending time learning about unfamiliar ones.
4.5. Desensitization
Over the course of this thesis, I have read numerous papers, books and articles about technology ethics issues.
The thesis also involved many conversations with other designers, researchers and peers about issues of ethics, privacy, behaviour manipulation, and bleak presents and futures. Conceptualizing workshops helped me personally experience, and observe with others, the creation of unethical interaction design scenarios. This involvement in learning about and discussing these disheartening issues has led me to become desensitized - with news stories about critical privacy issues becoming unremarkable, and my conversations about these issues becoming increasingly disengaged. As stories about ethics issues continue to emerge, desensitization grows more likely for the many people with a moderate to non-existent interest in these issues. This complacency can benefit firms, which can continue to create privacy-intrusive and manipulative technologies - as seen with online platforms that continue to gain and retain users despite significant privacy issues (Newton).
However, the knowledge gained from being immersed in ethics has had positive effects. For example, while working on social platform features for a client, I was able to identify the potential social harm that could be caused by a ‘like button’ feature the client requested, and encourage the client to rethink the request. While this is a relatively minor ethical diversion in the larger picture of widely used technologies, it indicates that awareness of these issues helped me identify a blind spot in my design process: it helped to be aware of the potential harm my work could cause, and to build a ‘critical consideration’ aspect into that process. This indicates the potential for these workshops to have similar effects on others, as they spend time trying to understand smart environment technologies and intrusion and behaviour-manipulation techniques, and practice using them in unique scenarios.
4.6. Further and Alternative Directions
The provocations and workshops in this thesis were designed for an audience of interaction designers. Designers have considerable influence over decisions made at technology firms, as the ‘contact point’ between firms, products and end users - making them ideal audiences for such workshops. Further, the workshop concept was validated by my acceptance to ‘What Can CHI Do About Dark Patterns’, a workshop about dark patterns in various contexts held at CHI 2021, a major conference on Human-Computer Interaction.
While the series of workshops appeared to be engaging and educational to participants, it is important to note that all participants were design students or faculty at Emily Carr University in Vancouver, Canada, though from a diverse range of ages and gender and cultural backgrounds. Given the subjective nature of ethics, and people’s varying attitudes towards technologies, privacy and behaviour manipulation, the response to and outcomes of these workshops could change dramatically with different audience groups. In particular, the workshops could be conducted with non-designers and with participants in different countries, and may have to change in format or approach to be impactful for these groups.
4.6.1. Workshops for education
Going forward, the context of design education would be ideal for running Dark Potentials workshop sessions. Through these sessions, students in interaction design can learn about, and freely explore, unethical and ethical concepts in a guided context.
The ‘ethics flip’ segment of the workshops, in particular, is useful for highlighting possibilities for using persuasive design techniques to achieve effective outcomes that benefit users. The workshops would enable students to develop a shared language around persuasion in smart environments, and to build an intuitive sense for the potentially harmful outcomes their work can cause. This can be useful in future situations where they are encouraged to design interactions that carry these potential outcomes.
These workshops can be run regularly as part of interaction design courses, particularly in classes focused on ubiquitous computing or smart environments. Ideally, sessions would be run in groups of 4 to 6 to encourage effective design outcomes. Before the workshops, instructors or students can present provocative prototypes that students can interact with and learn about. Following the sessions, groups can present the scenarios generated. As the randomized card activities encourage the creation of unique scenarios, students can learn from a variety of creative scenarios made by their peers. The Persuasive Patterns in Smart Cities website can be used by students as a reference before, during, and after the workshop sessions; examining a number of broken-down scenarios would help build familiarity with the concepts of both smart environments and intrusion and persuasion techniques. As smart environments grow increasingly relevant, the Dark Potentials workshops serve the dual purposes of familiarizing students with interaction design in this space and helping them develop an ethical intuition.
4.6.2. Workshops for firms
Further, an important audience to consider is firms working on Ubiquitous Computing and Smart Environment technologies, including Internet of Things projects. While many designers and product creators might be familiar with persuasive design techniques and how they apply to these technologies, the workshop can be useful for highlighting two important factors to consider while making design decisions in this field:
• Long-term negative and harmful consequences that can be brought about by seemingly harmless interaction design.
• Possibilities for outcomes that benefit users and avoid risk of harm, while meeting expected outcomes for firms.
The standalone nature of the workshops, where participants are encouraged to think of new and unique scenarios (supported by randomized card activities), allows participants to freely conceptualize scenarios and understand possible issues without having to relate ideas to projects they may already be working on. The workshops thus focus on building shared language, ethics literacy, and ethical design intuition - encouraging ethical decision-making in the long term, rather than creating short-term conflicts with ongoing work.
4.6.3. The general public audience
A third audience to consider is the general public - people that do not work on designing or developing technologies. The importance of this audience lies in the power of large groups to bring about positive change around ethical technologies, through lawsuits and pressure on governments to implement laws, regulation and fines. For these audiences, the workshop concept can be modified significantly to enable standalone, unguided sessions. An opportunity in this space is to create a more game-like activity, with possibilities for characters, game boards, rules and opportunities to ‘win’ by creating more unethical or ethical scenarios than other players.
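As a rough illustration of how such an unguided, game-like session could be mechanized, the sketch below reproduces the card draw used in the workshops: one Context card, one Technology (‘identifier’) card, and two Persuasion (‘trick’) cards, of which the player uses one. The deck contents here are illustrative assumptions, not the actual workshop cards.

    import random

    # Illustrative decks; the actual workshop cards differ.
    CONTEXT_CARDS = ["cafe", "library", "hotel", "transit station"]
    TECHNOLOGY_CARDS = [
        "gait-identifying camera",
        "face-scanning display",
        "proximity sensor",
        "tap-to-pay reader",
    ]
    PERSUASION_CARDS = [
        "limit the time available to make a decision",
        "appeal to scarcity",
        "public shaming",
        "personalized discounts",
    ]

    def draw_prompt(rng: random.Random) -> dict:
        """Draw one context, one identifier, and two tricks,
        mirroring the structure of the workshop game board."""
        return {
            "context": rng.choice(CONTEXT_CARDS),
            "identifier": rng.choice(TECHNOLOGY_CARDS),
            "tricks": rng.sample(PERSUASION_CARDS, k=2),
        }

    prompt = draw_prompt(random.Random())
    print(f"You run a {prompt['context']}.")
    print(f"Identifier available: {prompt['identifier']}.")
    print("Choose one trick to use:", " or ".join(prompt["tricks"]))
    print("Finally, flip it: how could the same system benefit the user?")

A session could be scored by how convincingly players articulate both the dark scenario and its ethical flip, echoing the ‘win’ conditions suggested above.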
Similar to education and corporate audiences, benefits for general public audiences include the growth of shared knowledge and language around these issues. For general public audiences, an ethical intuition can help people identify potentially harmful outcomes in technologies they come across, which can help them make decisions around their usage of, and support for, those technologies.
4.6.4. Website as a point of contact
The possibility of an unguided, game-like card activity presents an opportunity for the Persuasive Patterns in Smart Cities website to grow indefinitely through public contributions. Visitors to the website can download cards and instructions, and ‘play’ through their own sessions. The results of these sessions - scenarios generated and patterns identified - can be added to the website. With large amounts of visitor-generated scenarios and patterns, visitors can help identify the most significant content - the most threatening and most beneficial scenarios, and the patterns that work best - through a ranking system. This ensures that the website displays relevant and impactful scenarios and patterns over time.
5. Discussion
Through this thesis, a number of concerns and areas of interest emerged. These include issues around informed consent in smart environments, the benefits of distinguishing between technologies that identify and technologies that persuade, and the potential for these identifiers and persuaders to be used in ethical ways that benefit the people and environments they exist in.
5.1. Consent in Pervasive, Invasive Environments
Consent is an important part of ethically designed technologies. In 2014, the social network Facebook was revealed to have conducted mood manipulation experiments on its users without prior consent. Users’ news feeds were manipulated by data scientists to show more positive or negative content, a practice that was widely criticized when discovered (BBC). The GDPR requires websites to obtain user consent before using cookies to track people’s internet browsing behaviour. A study by Saric revealed that over 90% of website visitors did not consent to tracking cookies when a website used a legally valid implementation of the GDPR consent request banners (Saric).
In existing ubiquitous computing environments, users are assumed to have consented to a system’s privacy policies by their continued presence in, or usage of, the environment. For example, a parking facility that captures license plates via camera displays a signboard outside, informing potential customers about this video-based surveillance (NCP). This type of message is found in other public spaces, including buses and buildings. Customers or users of these spaces are not able to use the space without automatically consenting to its interactions and/or surveillance. As UbiComp systems become lower in cost and profit-generating interface patterns become commonplace, many physical businesses and spaces are likely to adopt these systems and patterns. This would leave customers little choice while visiting a business or public space: to use the service, they must consent to its terms.
5.2. Identifiers and Persuaders
Technologies that abuse user privacy are commonly paired with persuasive technologies that use this data to create scenarios that benefit businesses or organizations. For example, Amazon collects data about how people use its services, and offers personalized recommendations and various dark patterns to sell products online.
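To make this pairing concrete, the sketch below models a hypothetical retail scenario in which an Identifier collects behavioural data and a Persuader turns that data into a targeted nudge. The class names, data and logic are assumptions made for illustration; they do not describe any real platform’s system.

    from collections import Counter

    class Identifier:
        """Collects behavioural data - a stand-in for cameras,
        cookies, loyalty cards or other tracking mechanisms."""
        def __init__(self) -> None:
            self.observations: Counter = Counter()

        def observe(self, category: str) -> None:
            # e.g. which product categories a visitor lingers near
            self.observations[category] += 1

    class Persuader:
        """Uses the Identifier's data to nudge behaviour, here by
        generating an offer for the strongest observed interest."""
        def __init__(self, identifier: Identifier) -> None:
            self.identifier = identifier

        def next_offer(self) -> str | None:
            if not self.identifier.observations:
                return None  # nothing to act on without identification
            interest, _ = self.identifier.observations.most_common(1)[0]
            return f"Limited-time discount on {interest}!"

    tracker = Identifier()
    for category in ["wine", "snacks", "wine"]:
        tracker.observe(category)
    print(Persuader(tracker).next_offer())  # Limited-time discount on wine!

Even in this toy form, the Persuader is inert without the Identifier’s data, mirroring the dependency discussed below.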
Identifying the relationship between privacy-intrusive technologies and profit-generating persuasive technologies is key to fully understanding their effects and unethical possibilities.
5.2.1. Identification and Persuasion
In ‘Politics the Wellstone Way’, Wellstone Action describes the process of first identifying a campaign’s potential voter base, and then persuading those voters:
“Targeting provides the likely precincts and areas where persuadable voters live in large numbers. But to actually have a targeted conversation with both base voters and persuadable voters, the campaign needs to specifically identify them. This is known as voter ID, and it is done in various ways. It starts by obtaining the list of all registered voters in the district.” (Wellstone Action)
This two-step process of identification and persuasion can help target and influence people’s behaviours, prompting them to vote for certain politicians over others. In digital product design, this process can be carried out through interface design patterns that can be called ‘Identifiers’ and ‘Persuaders’. Identifiers, such as tracking cookies, help businesses collect data on a customer’s interests, their behaviour online, their interactions with other people, and their locations. Persuaders, which include many ‘dark patterns’, influence customers’ behaviours, causing them to spend excessive amounts of time or money, or otherwise behave in ways that increase profits for businesses.
The reliance of digital businesses on privacy-invasive, behaviour-change business models often leads to harmful consequences. For example, YouTube identifies its users’ interests over time by tracking the videos they watch and interact with. Persuasive features, like ’Autoplay’ and algorithmic recommendations, encourage users to watch videos of similar content for long periods. This is valuable for YouTube, as users spend more time watching advertisements and build regular viewing habits. It can also place users in ‘filter bubbles’ that surface radicalizing, extremist content, sometimes provoking extremist action: people may become addicted to extremist media and carry out violence, vote for harmful governments, or unintentionally avoid locations that are economically disadvantaged or racially diverse (Kaiser and Rauchfleisch). Intrusive and persuasive technologies can thus have far-reaching negative effects on people and societies.
5.2.2. Identifiers and Persuaders in Smart Environments
In smart environments, Identifiers are commonplace, and are in many cases being legally evaluated or banned. Common applications include transit tap-to-pay systems (Winston), where users’ behaviour and location are tracked over time. Facial recognition is used at airports (Street) and in general public spaces, and is banned from use by police in some cities (Metz). A number of health insurance providers and governments encourage or incentivize people to use fitness trackers and track their activities and health (Farr); data from these trackers can be used to adjust insurance prices, for example. Location data is commonly tracked through smartphone apps and used to customize experiences or directly persuade users. For example, the navigation app Waze collaborated with McDonald’s to navigate users to McDonald’s locations when they drove past billboards for the restaurant (Burton). Identifiers are often ‘invisible’, both in personal computing environments and in physical, shared ones.
Greenberg et al. describe this concept in ‘Dark Patterns in Proxemic Interactions’:
“In day-to-day life, proximity is an ephemeral phenomenon. The proxemic relationship between parties dissolves as soon as they separate. In contrast, systems can tag any proxemic interactions as indicating a permanent, persistent (and undesirable) relationship that is never forgotten.”
As users interact with smart environments - such as by tapping transit cards at a train station - they are not necessarily aware that the systems they come into contact with are collecting and storing information about these interactions.
Persuaders in smart environments can similarly be invisible. The Subscription Collapsible Bench prototype is an example of an interactive object that uses discomfort as a means to persuade people - to encourage transit pass purchases, and to discourage loitering and sleeping. For many people, this discomfort is non-existent or negligible, leaving the persuasive mechanism ‘invisible’. Another speculative example of invisible identification and persuasion is the use of automated cars by police forces. During a protest, police may use crowd control techniques such as ‘kettling’, where protestors are corralled by coordinated police officers moving in specific directions (Groundwater). This technique could be emulated by automated police vehicles that intelligently identify protestors and routes to clear, and work synchronously with other vehicles across entire cities.
Identifiers and Persuaders thus work closely together to bring about data collection and behaviour manipulation. In many examples of technologies with poor ethics, Identifier patterns provide the information required for Persuader patterns to work; systems might in turn use Persuader patterns to persuade users to share that information. Across personal technologies and shared ‘smart environment’ technologies, understanding the relationship between Identifiers and Persuaders is important to understanding their ethics.
5.3. Light Patterns and Light Potential
Dark patterns are often part of persuasive interfaces, as they prompt users to take actions or behave in ways that are undesirable or harmful (Brignull, Dark Patterns). This suggests the existence of opposing ‘light patterns’: patterns that persuade people to behave in ways that are beneficial to themselves and others. For example, Twitter’s behaviour when a user retweets an article without reading it can be considered a ‘light pattern’: Twitter visually prompts the user to read the article before retweeting it, and claims that this feature is meant to “help promote informed discussion” (Vincent). Unlike ‘bottomless bowl’-like dark patterns that aim to remove moments of reflection and decision-making, this pattern encourages the user to pause and make an alternative decision.
Identifiers do not possess any ethics of their own, and can be used ethically or unethically; an Identifier can have both dark and light potential. An example of this is the Ring doorbell camera, used to protect buildings from intruders. While people can feel uncomfortable and unsafe around public camera recordings in many scenarios, doorbell cameras can be acceptable due to their utility and specific scope of usage. However, this scope can expand to include concerning use cases: in 2019, Ring announced partnerships with over 400 police forces in the USA, offering access to footage from Ring cameras installed in buildings across the country (Paul).
Experts in law and privacy claim that "the program could threaten civil liberties, turn residents into informants, and subject innocent people, including those who Ring users have flagged as “suspicious,” to greater surveillance and potential risk” (Osier). This demonstrates the potential for an Identifier to be used both for intentional, beneficial purposes (safety and protection) and for intrusive, potentially harmful ones (offering video access to police forces). Doorbell cameras, as Identifiers, thus have both ‘dark potential’ and ‘light potential’.
Identifying dark and light potential can be important for governments, designers, and the general public. Whether creating policy around an identifier or persuader, using it in a design project, or simply evaluating its ethics, understanding its dark and light potential is key to pushing technologies towards ethical and beneficial outcomes.
5.4. Positive Developments
Governmental policies such as the GDPR in the EU are a positive development in technology ethics. They reflect people’s ability to collectively identify and handle ethics issues through substantial action, and they help prevent unethical design in technologies going forward. This form of preventative and controlling action continues to develop, as with the 2021 ban on privacy-intrusive dark patterns in California (Simonite) and several bans on police usage of facial recognition (Moon).
Action from firms can also have significant positive effects on technology ethics. Recent updates to Apple’s iOS mobile platform require all apps to obtain user consent before tracking them. This feature was criticized by advertising companies, which claimed it would cause a loss of advertising revenue for businesses (Statt, “Facebook Prompt Will Encourage Ad Tracking Opt-in”). This highlights the potential for firms to return choice and consent to users. It also highlights people’s awareness of privacy issues, which would be required for Apple to consider this a feature worth developing.
5.5. Future Directions
This thesis resulted in a methodology for bringing designers and creative professionals together to collectively speculate on unethical and ethical Ubiquitous Computing technologies. The provocation-led workshop format allows participants to better identify unethical possibilities in technologies around them, and in ones they might be working on. It also allows for the discovery of generally applicable ethical patterns or guidelines that can help push technologies towards more considerate and beneficial ideals. Going forward, the provocation-led workshops can be replicated indefinitely, as each session results in the conceptualization of new provocations to be demonstrated at further workshops. Over time, this can come together in the form of workshops in education, corporate and public contexts, building an ever-growing library of scenarios with questionable ethics and contributing to a shared language around technology ethics. This will benefit technologists, governing and regulatory bodies, and the general public in identifying and preventing the development of unethical technologies.
6. Closing Remarks
Protecting against the unethical use of technology is crucial to the wellbeing and development of human society.
Digital technologies grow ever more intertwined with our lives as we use them to learn, work, play, interact with others and make decisions. How they interact with us affects us at all scales - from influencing single thoughts in individuals to shaping social and political movements that change the world. As we become increasingly - and often uncontrollably - reliant on these technologies, the risk of damage they pose grows ever larger.
Smart environment technologies exemplify this uncontrollable pervasiveness. A privacy-intrusive face recognition system at the city scale, for example, is extremely difficult or potentially impossible to avoid. Combined with public apathy and social or occupational obligations, harmful ubiquitous technologies can spread widely without resistance. In the shadow of this risk, awareness and protective action grow vitally important. For designers, technologists and firms, an instinct to consider the harmful impact of their products must be cultivated - through education and possible legal risk. Our future lies in the hands of actors with inconsiderate or malicious intentions, unless we stop to reflect and take control of how technology impacts our world.
7. References
Anderson, Stephen P. Seductive Interaction Design: Creating Playful, Fun, and Effective User Experiences. 1st edition, New Riders, 2011.
Ariely, Dan. Predictably Irrational, Revised and Expanded Edition: The Hidden Forces That Shape Our Decisions. https://www.amazon.ca/Predictably-Irrational-Revised-Expanded-Decisions/dp/0061353248. Accessed 27 Apr. 2021.
BBC. “Facebook Admits Failings over Emotion Manipulation Study.” BBC News, 3 Oct. 2014, https://www.bbc.com/news/technology-29475019.
Behavioural Insights Team. Automatic Enrolment and Pensions: A Behavioural Success Story. https://www.bi.team/blogs/automatic-enrolment-and-pensions-a-behavioural-success-story/. Accessed 27 Apr. 2021.
Bleecker, Julian. Design Fiction: A Short Essay on Design, Science, Fact and Fiction. Mar. 2009.
Bowles, Cennydd. Future Ethics. NowNext Press, 2018.
Breland, Ali. “Engineer Who Created Facebook ‘like’ Button Swears off Social Media Apps.” TheHill, 9 Oct. 2017, https://thehill.com/policy/technology/354574-the-engineer-that-created-thefacebook-like-now-limits-his-use-of-facebook.
Brignull, Harry. Dark Patterns. https://darkpatterns.org/. Accessed 6 Dec. 2020.
---. Interview. 28 July 2020.
---. “Some Dark Patterns Now Illegal in UK – Interview with Heather Burns.” 90 Percent Of Everything, 26 Aug. 2014, https://www.90percentofeverything.com/2014/08/26/some-darkpatterns-now-illegal-in-uk-interview-with-heather-burns/.
Buchenau, Marion, and Jane Suri. Experience Prototyping. 2000, pp. 424–33, doi:10.1145/347642.347802.
Burton, Monica. “Waze Is Watching You and It Knows You Want McRibs.” Eater, 19 Mar. 2019, https://www.eater.com/2019/3/19/18272694/waze-app-ads-steer-drivers-to-mcdonalds-mcribs.
Calo, Ryan, and Alex Rosenblat. The Taking Economy: Uber, Information, and Power. SSRN Scholarly Paper, ID 2929643, Social Science Research Network, 9 Mar. 2017, doi:10.2139/ssrn.2929643.
Canada, Office of the Privacy Commissioner of. The Personal Information Protection and Electronic Documents Act (PIPEDA). 4 Sept. 2019, https://www.priv.gc.ca/en/privacy-topics/privacy-laws-in-canada/the-personal-information-protection-and-electronic-documents-actpipeda/.
Carter, Adam, et al.
“Sidewalk Labs Cancels Plan to Build High-Tech Neighbourhood in Toronto amid COVID-19.” CBC, 7 May 2020, https://www.cbc.ca/news/canada/toronto/sidewalk-labs-cancels-project-1.5559370.
Chakrabarti, Meghna. Autoplay On: YouTube Tolerated Toxic Content For Engagement, Report Says. https://www.wbur.org/onpoint/2019/04/08/youtube-toxic-content-extremism-bloombergsafety-internet. Accessed 11 Feb. 2021.
Christl, Wolfie. Corporate Surveillance In Everyday Life. How Companies Collect, Combine, Analyze, Trade, and Use Personal Data on Billions. 2017, https://www.semanticscholar.org/paper/Corporate-Surveillance-In-Everyday-Life.-How-Trade%2C-Christl/869d0da76cb4cd2d6bcb786f2ed2863acee10639.
Coldewey, Devin. “Inside Amazon’s Surveillance-Powered, No-Checkout Convenience Store.” TechCrunch, https://social.techcrunch.com/2018/01/21/inside-amazons-surveillance-poweredno-checkout-convenience-store/. Accessed 11 Feb. 2021.
Eyal, Nir. Hooked: How to Build Habit-Forming Products. https://www.goodreads.com/work/best_book/27477502-hooked-how-to-buildhabit-forming-products. Accessed 27 Apr. 2021.
Farr, Christina. “Fitbit Wins Contract with Singapore to Supply Trackers to Potentially Hundreds of Thousands of Citizens.” CNBC, 21 Aug. 2019, https://www.cnbc.com/2019/08/21/fitbit-tosupply-trackers-to-hundreds-of-thousands-in-singapore.html.
Forbrukerrådet. Report: Deceived by Design. https://www.forbrukerradet.no/undersokelse/no-undersokelsekategori/deceived-by-design/. Accessed 6 Dec. 2020.
Friedersdorf, Conor. “YouTube Extremism and the Long Tail.” The Atlantic, 12 Mar. 2018, https://www.theatlantic.com/politics/archive/2018/03/youtube-extremism-and-the-long-tail/555350/.
Gewirtz, David. “A Year of Closing My Rings: How My Apple Watch Kept Me Moving All Year.” ZDNet, https://www.zdnet.com/article/a-year-of-closing-my-rings-how-my-apple-watch-kept-memoving-all-year/. Accessed 27 Apr. 2021.
Gibbs, Samuel. “How to Turn off Google’s Location Tracking.” The Guardian, 14 Aug. 2018, http://www.theguardian.com/technology/2018/aug/14/how-to-turn-off-google-location-tracking.
Gillespie, Eden. “Are You Being Scanned? How Facial Recognition Technology Follows You, Even as You Shop.” The Guardian, 24 Feb. 2019, http://www.theguardian.com/technology/2019/feb/24/are-you-being-scanned-how-facial-recognition-technology-follows-you-even-as-you-shop.
Greenberg, Saul, et al. “Dark Patterns in Proxemic Interactions: A Critical Perspective.” Proceedings of the 2014 Conference on Designing Interactive Systems, Association for Computing Machinery, 2014, pp. 523–32, doi:10.1145/2598510.2598541.
---. Interview. 25 Sept. 2020.
Greenfield, Adam. Everyware: The Dawning Age of Ubiquitous Computing. New Riders, 2006.
Groundwater, Colin. “‘Kettling’ Is Supposed to Defuse Protests—Instead, It Does the Opposite.” GQ, https://www.gq.com/story/what-is-kettling. Accessed 17 Mar. 2021.
Harris, Tristan. “How Technology Is Hijacking Your Mind — from a Former Insider.” Medium, 18 May 2016, https://medium.com/thrive-global/how-technology-hijacks-peoples-minds-from-amagician-and-google-s-design-ethicist-56d62ef5edf3#.omfybhlgs.
Hawkins, Andrew J. “Robotaxis Get the Green Light for Paid Rides in California.” The Verge, 23 Nov. 2020, https://www.theverge.com/2020/11/23/21591045/california-robotaxi-paid-rides-cpucpermits.
Hu, Winnie. “‘Hostile Architecture’: How Public Spaces Keep the Public Out.” The New York Times, https://www.nytimes.com/2019/11/08/nyregion/hostile-architecture-nyc.html. Accessed 11 Feb. 2021.
Kaiser, Jonas, and Adrian Rauchfleisch. “How YouTube Helps Form Homogeneous Online Communities.” Brookings, 23 Dec. 2020, https://www.brookings.edu/techstream/how-youtubehelps-form-homogeneous-online-communities/.
Lin, Liza, and Josh Chin. “China’s All-Seeing Surveillance State Is Reading Its Citizens’ Faces.” Wall Street Journal, 26 June 2017, https://www.wsj.com/articles/the-all-seeingsurveillance-state-feared-in-the-west-is-a-reality-in-china-1498493020.
Lopatto, Elizabeth. “Christchurch Shooter Was Radicalized on YouTube, New Zealand Report Says.” The Verge, 8 Dec. 2020, https://www.theverge.com/2020/12/8/22162779/christchurchshooter-youtube-mosque-radicalized.
Madrigal, Alexis C., and Ian Bogost. “How Facebook Works for Trump.” The Atlantic, https://www.theatlantic.com/technology/archive/2020/04/how-facebooks-ad-technologyhelps-trump-win/606403/. Accessed 27 Apr. 2021.
Mathur, Arunesh, et al. “Dark Patterns at Scale: Findings from a Crawl of 11K Shopping Websites.” ArXiv:1907.07032 [Cs], July 2019, doi:10.1145/3359183.
Metz, Rachel. “Portland Passes Broadest Facial Recognition Ban in the US.” CNN, https://www.cnn.com/2020/09/09/tech/portland-facial-recognition-ban/index.html. Accessed 11 Feb. 2021.
Molla, Rani. “Americans Spent about 3.5 Hours per Day on Their Phones Last Year — a Number That Keeps Going up despite the ‘Time Well Spent’ Movement.” Vox, 6 Jan. 2020, https://www.vox.com/recode/2020/1/6/21048116/tech-companies-time-well-spent-mobile-phoneusage-data.
Moody, Glyn. “Fake Mobile Phone Towers Discovered in London: Stingrays Come to the UK.” Ars Technica, 6 Oct. 2015, https://arstechnica.com/tech-policy/2015/06/fake-mobile-phonetowers-discovered-in-london-stingrays-come-to-the-uk/.
Moon, Mariella. Massachusetts Lawmakers Pass State-Wide Police Ban on Facial Recognition. https://finance.yahoo.com/news/massachusetts-statewide-police-ban-facial-recognitionbill-064534909.html. Accessed 6 Dec. 2020.
Moss, Emanuel, and Jacob Metcalf. “The Ethical Dilemma at the Heart of Big Tech Companies.” Harvard Business Review, Nov. 2019, https://hbr.org/2019/11/the-ethical-dilemma-atthe-heart-of-big-tech-companies.
NCP. Our ANPR Ticketless Car Parks. https://www.ncp.co.uk/help-centre/payonexit/. Accessed 11 Feb. 2021.
Near Future Laboratory. Helios: Pilote Quick Start Guide by Near Future Laboratory. http://qsg.nearfuturelaboratory.com/. Accessed 11 Feb. 2021.
Newton, Casey. “Facebook Usage and Revenue Continue to Grow as the Pandemic Rages On.” The Verge, 30 July 2020, https://www.theverge.com/2020/7/30/21348308/facebookearnings-q2-2020-pandemic-revenue-usage-growth.
Nicenboim, Iohanna. Interview. 13 July 2020.
---. Objects of Research - Iohanna Nicenboim. https://iohanna.com/Objects-of-Research. Accessed 29 Mar. 2020.
Nicks, Denver. “LinkedIn to Pay $13 Million in Spam Settlement.” Time, https://time.com/4062519/linkedn-spam-settlement/. Accessed 27 Apr. 2021.
Osier, Valerie. “Ring and LBPD Partner up, Giving Police Easier Access to Doorbell Camera Footage.” Long Beach Post News, https://lbpost.com/news/crime/ring-doorbell-camera-securityprivacy. Accessed 11 Feb. 2021.
Paul, Kari. “Amazon’s Doorbell Camera Ring Is Working with Police – and Controlling What They Say.” The Guardian, 30 Aug. 2019, http://www.theguardian.com/technology/2019/aug/29/ring-amazon-police-partnership-social-media-neighbor.
Pot, Justin.
“Dark Patterns: When Companies Use Design to Manipulate You.” How-To Geek, https://www.howtogeek.com/363484/dark-patterns-when-companies-use-design-to-manipulateyou/. Accessed 27 Apr. 2021.
Resnick, Brian. “Is Our Constant Use of Digital Technologies Affecting Our Brain Health? We Asked 11 Experts.” Vox, 28 Nov. 2018, https://www.vox.com/science-and-health/2018/11/28/18102745/cellphone-distraction-brain-health-screens-kids.
Rupp, Rebecca. “Surviving the Sneaky Psychology of Supermarkets.” National Geographic, 14 June 2015, https://www.nationalgeographic.com/culture/food/the-plate/2015/06/15/surviving-thesneaky-psychology-of-supermarkets/.
Russell, Gillian, et al. Reimagining the Now (in Development) - Garnet Hertz. http://conceptlab.com/reimaginingthenow/. Accessed 11 Feb. 2021.
Saric, Marko. “Only 9% of Visitors Give GDPR Consent to Be Tracked.” Marko Saric, 6 July 2020, https://markosaric.com/gdpr-consent/.
Schep, Tijmen. Social Cooling - Big Data’s Unintended Side Effect. https://www.socialcooling.com/index.html. Accessed 17 Mar. 2021.
SCMP Reporters. Hong Kong Protests: Police Use Court Orders to Obtain Protesters’ Digital Fare Payment Details in Another Weekend of Petrol Bombs, Tear Gas and Fires on the Streets. https://sg.news.yahoo.com/hong-kong-protests-police-court-161146192.html. Accessed 17 Mar. 2021.
Shieber, Jonathan. “As Payment and Surveillance Technologies Collide, Free Speech Could Be a Victim.” TechCrunch, https://social.techcrunch.com/2019/06/12/as-payment-and-surveillancetechnologies-collide-free-speech-could-be-a-victim/. Accessed 17 Mar. 2021.
Simonite, Tom. “Lawmakers Take Aim at Insidious Digital ‘Dark Patterns’.” Wired, https://www.wired.com/story/lawmakers-take-aim-insidious-digital-dark-patterns/. Accessed 18 Mar. 2021.
Singer, Natasha. “When Websites Won’t Take No for an Answer.” The New York Times, 14 May 2016, https://www.nytimes.com/2016/05/15/technology/personaltech/whenwebsites-wont-take-no-for-an-answer.html.
Solon, Olivia. “Facebook Says Cambridge Analytica May Have Gained 37m More Users’ Data.” The Guardian, 4 Apr. 2018, http://www.theguardian.com/technology/2018/apr/04/facebookcambridge-analytica-user-data-latest-more-than-thought.
Statt, Nick. “Amazon Is Expanding Its Cashierless Go Model into a Full-Blown Grocery Store.” The Verge, 25 Feb. 2020, https://www.theverge.com/2020/2/25/21151021/amazon-go-grocerystore-expansion-open-seattle-cashier-less.
---. “Facebook Prompt Will Encourage Ad Tracking Opt-in Ahead of Apple’s Privacy Push.” The Verge, 1 Feb. 2021, https://www.theverge.com/2021/2/1/22260274/facebook-prompt-apple-iosad-tracking-opt-in-permission-privacy-update.
Street, Francesca. “How Facial Recognition Is Taking over Airports.” CNN, https://www.cnn.com/travel/article/airports-facial-recognition/index.html. Accessed 11 Feb. 2021.
Sunstein, Cass R. The Ethics of Influence: Government in the Age of Behavioral Science. Cambridge University Press, 2016.
Vanian, Jonathan. “Amazon Go’s Cashierless Stores Have a New Rival.” Fortune, https://fortune.com/2019/12/20/amazon-gos-cashierless-stores-have-a-new-rival/. Accessed 17 Mar. 2021.
Vincent, James. “Twitter Is Bringing Its ‘Read before You Retweet’ Prompt to All Users.” The Verge, 25 Sept. 2020, https://www.theverge.com/2020/9/25/21455635/twitter-read-before-youtweet-article-prompt-rolling-out-globally-soon.
Weiser, Mark. “The Computer for the 21st Century.” ACM SIGMOBILE Mobile Computing and Communications Review, vol. 3, no. 3, July 1999, pp. 3–11,
doi:10.1145/329124.329126.
Wellstone Action. “Politics the Wellstone Way.” University of Minnesota Press, https://www.upress.umn.edu/book-division/books/politics-the-wellstone-way. Accessed 6 Dec. 2020.
Whitney, Lance. “Creator of the Internet Pop-up Ad Apologizes for ‘Hated Tool.’” CNET, https://www.cnet.com/news/creator-of-internet-pop-up-ad-apologizes-for-hated-tool/. Accessed 17 Mar. 2021.
Whittaker, Zack. “ICE Used ‘Stingray’ Cell Phone Snooping Tech Hundreds of Times since 2017.” TechCrunch, https://social.techcrunch.com/2020/05/27/aclu-ice-stingray-documents/. Accessed 11 Feb. 2021.
Winston, Ali. “The NYC Subway’s New Tap-to-Pay System Has a Hidden Cost — Rider Data.” The Verge, 16 Mar. 2020, https://www.theverge.com/2020/3/16/21175699/mta-omny-privacysecurity-smartphone-identifier-location-nyc.
Wong, Julia Carrie. “How Facebook and YouTube Help Spread Anti-Vaxxer Propaganda.” The Guardian, 1 Feb. 2019, http://www.theguardian.com/media/2019/feb/01/facebook-youtube-antivaccination-misinformation-social-media.
Woywada, Jason. Interview. 7 July 2020.
Zuboff, Shoshana. “Big Other: Surveillance Capitalism and the Prospects of an Information Civilization.” Journal of Information Technology, vol. 30, no. 1, Mar. 2015, p. 79, doi:10.1057/jit.2015.5.