The Need for Respectful Technologies: Going Beyond Privacy

Abstract: Digital technologies, the data they collect, and the ways in which that data is used increasingly affect our psychological, social, economic, medical, and safety-related well-being. While technology can be used to improve our well-being on all of these axes, it can also perpetrate harm. Prior research has focused near-exclusively on privacy as a primary harm. Yet privacy is only one of many considerations that users weigh when adopting a technology. In this chapter, I use the case study of COVID-19 apps to argue that this reductionist view of technology harm has prevented effective adoption of beneficial technology. Further, a privacy-only focus risks perpetuating and magnifying existing technology-related inequities. To realize the potential of well-being technology, we need to create technologies that are respectful not only of user privacy but of users’ expectations for their technology use and the context in which that use takes place.

Digital technologies are increasingly intertwined with lived experiences of well-being. The ways in which we use technologies, and the ways in which technologies use our data, affect our psychological, social, economic, medical, and safety-related well-being. For example, being able to check in on the well-being of others during natural disasters can bolster the strength of our physical-world communities and enhance our personal feelings of safety (Redmiles et al. 2019). In the health space, there is growing excitement about, and promising evidence for, prescribing technologies to aid in the management of chronic illness (Byambasuren et al. 2018).

Despite their potential to improve our well-being, these same technologies can also perpetrate harm. Much of the dialogue regarding the harms of well-being technologies focuses specifically on data privacy risks: how the misuse of user data can create psychological, social, economic, or safety-related harms (Vitak et al. 2018, Redmiles et al. 2019).

Privacy has been shown to be a key, and growing, concern for users when considering whether to adopt new technologies, including well-being-related technologies. However, privacy is far from the only consideration that affects whether a user will adopt a new technology. Here, I argue that we have developed a reductionist focus on privacy in considering whether people will adopt a new technology. This focus has prevented us from effectively achieving adoption of beneficial technologies and risks perpetuating and magnifying inequities in technology access, use, and harms.

By focusing exclusively on data privacy, we fail to fully capture users’ desire for respectful technologies: systems that respect users’ expectations for how their data will be used and for how the system will influence their lives and the contexts surrounding them. I argue that users’ decisions to adopt a new technology are driven by their perception of whether that technology will be respectful.

A large body of research shows that users’ technology-adoption behavior is often misaligned with their expressed privacy concerns. While this phenomenon, the privacy paradox, is explained in part by cognitive biases such as endowment and ordering effects (Acquisti et al. 2013), it should perhaps not be such a surprise that people’s decisions to adopt or reject a technology are based on more than just the privacy of that technology.

Privacy calculus theory (PCT) agrees, going beyond privacy alone to also consider benefits, arguing that “individuals make choices in which they surrender a certain degree of privacy in exchange for outcomes that are perceived to be worth the risk of information disclosure” (Dinev and Hart 2006). However, as I illustrate below, placing privacy on one side of the scale as the sole detractor and outcomes (or benefits) on the other remains too reductionist to fully capture user behavior, especially in well-being-related settings.

The incompleteness of a privacy-only view toward designing respectful technologies was exemplified in the rush to create COVID-19 technologies. In 2020, technology companies and researchers developed exposure notification applications designed to detect potential exposures to the coronavirus and notify app users of those exposures. These apps were created to replace and/or augment manual contact tracing, in which human tracers call infected individuals to trace, and then notify, their networks of contacts.

In tandem with the push to design these technologies was a push to ensure that these designs were privacy-preserving (Troncoso et al. 2020). While ensuring the privacy of these technologies was critically important for preventing government misuse and human rights violations, and for addressing users’ concerns, people rarely adopt technologies just because they are private (Abu-Salma et al. 2017). Indeed, after many of these apps were released, only a minority of people adopted them. Missing from the conversation was a discussion of users’ other expectations for COVID-19 apps.

Privacy calculus theory posits that users trade off privacy against benefits and, in so doing, make decisions about which technologies to adopt. However, empirical research on people’s adoption considerations for COVID-19 apps finds a more complex story (Li et al. 2020, Redmiles 2020, Simko et al. 2020). People consider not only the benefits of COVID-19 apps (whether the app can notify them of a COVID-19 exposure, for example) but also how the efficacy of the app (how many exposures it can actually detect) might erode those benefits. Indeed, preliminary research shows that efficacy considerations may be far more important than benefit considerations in users’ COVID-19 app adoption decisions (Learning from the People: Responsibly Encouraging Adoption of Contact Tracing Apps 2020). On the other side of the scale, privacy considerations are not the only potential detractors: people also consider the costs of using the system, both monetary (e.g., the cost of mobile data used by the app) and usability-related (e.g., erosion of phone battery life from using the app).
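To make this extended calculus concrete, one can sketch the adoption decision as a simple inequality. This formalization is my own illustration, not a model proposed in the cited studies, and all symbols denote perceived, user-specific quantities rather than measurable constants:

\[ E \cdot B \;>\; R_{\text{privacy}} + C_{\text{monetary}} + C_{\text{usability}} \]

where \(B\) is the perceived benefit of the app, \(E \in [0,1]\) is the perceived efficacy that discounts that benefit, \(R_{\text{privacy}}\) is the perceived privacy risk, and \(C_{\text{monetary}}\) and \(C_{\text{usability}}\) are the perceived monetary and usability costs. Classical privacy calculus corresponds to the special case in which \(E = 1\) and \(C_{\text{monetary}} = C_{\text{usability}} = 0\): benefits weighed against privacy risk alone.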

People’s adoption considerations for COVID-19 apps exemplify the idea of respectful technologies: those that provide a benefit with a sufficient level of guarantee (efficacy) in exchange for using the user’s data, with the potential privacy risks resulting from such use, at an appropriate monetary and usability cost. While COVID-19 apps offered benefits and protected user privacy, app developers and jurisdictions initially failed to evaluate the efficacy and cost of what they had built, and failed to be transparent with users about both. As a result, people were unable to evaluate whether these technologies were respectful, and adoption of a technology that had the potential to significantly benefit individual and societal well-being during a global pandemic remained low.

Examining the full spectrum of people’s respectful technology-related considerations is especially critical for well-being-related applications for two reasons.

First, there is a multitude of types of well-being that technology increasingly addresses, from natural disaster check-in solutions to mental health treatment systems, each with a corresponding variety of harms, costs, and risks that users may consider. If we focus strictly on the privacy-benefit tradeoffs of such technologies, we may miss critical adoption considerations, such as whether the user suspects they might be harassed while using, or for using, a particular technology (Redmiles et al. 2019). Failing to design for and examine these additional adoption considerations can be a significant barrier to increasing adoption of commercially profitable and individually or societally beneficial technologies.

Second, different sociodemographic groups prioritize different aspects of respectful technologies (Learning from the People: Responsibly Encouraging Adoption of Contact Tracing Apps 2020). For example, older adults focus more on the costs of COVID-19 apps than do younger adults, while younger adults focus more on the efficacy of these apps than do older adults. Ignoring considerations other than privacy and benefits can perpetuate inequities in whose needs are designed for in well-being technologies and, ultimately, in who adopts those technologies. Such equity considerations are especially important for well-being technologies, for which equitable access is critical and for which inequitable distribution of harms can be especially damaging.

Thus, to ensure the commercial viability and adoption of well-being technologies, and to avoid perpetuating and magnifying well-being inequities through the creation of such technologies, it is critical to build respectful well-being technologies. Technology creators and researchers must consider not only the privacy risks and protections of such technologies, and the technology’s benefits, but also the contextual, cost, and efficacy considerations that together make up a potential user’s view of whether a well-being technology is respectful of them and their data. To do so, two approaches are necessary: first, direct measurement of the cost and efficacy of the technologies produced, in line with expectations for evidence in other fields such as health (Burns et al. 2011); second, direct inquiry with potential users to understand contextual and qualitative costs. By combining these two approaches to empirical measurement, we can better create well-being technologies that are both effective and respectful.

References

Abu-Salma, R., Sasse, M. A., Bonneau, J., Danilova, A., Naiakshina, A., and Smith, M. (2017). Obstacles to the Adoption of Secure Communication Tools. In: 2017 IEEE Symposium on Security and Privacy (SP). IEEE Computer Society.

Acquisti, A., John, L. K., and Loewenstein, G. (2013). What Is Privacy Worth? The Journal of Legal Studies, 42 (2), 249–274.

Burns, P. B., Rohrich, R. J., and Chung, K. C. (2011). The levels of evidence and their role in evidence-based medicine. Plastic and Reconstructive Surgery, 128 (1), 305.

Byambasuren, O., Sanders, S., Beller, E., and Glasziou, P. (2018). Prescribable mHealth apps identified from an overview of systematic reviews. npj Digital Medicine, 1 (1), 1–12.

Dinev, T. and Hart, P. (2006). An Extended Privacy Calculus Model for E-Commerce Transactions. Information Systems Research, 17 (1), 61–80.

Learning from the People: Responsibly Encouraging Adoption of Contact Tracing Apps (2020). Available from: https://www.youtube.com/watch?v=my_Sm7C_Jt4&t=366s [Accessed 17 Mar 2021].

Li, T., Cobb, C., Yang, J., Baviskar, S., Agarwal, Y., Li, B., Bauer, L., and Hong, J. I. (2020). What Makes People Install a COVID-19 Contact-Tracing App? Understanding the Influence of App Design and Individual Difference on Contact-Tracing App Adoption Intention. arXiv:2012.12415 [cs] [online]. Available from: http://arxiv.org/abs/2012.12415 [Accessed 17 Mar 2021].

Redmiles, E. M. (2020). User Concerns & Tradeoffs in Technology-facilitated COVID-19 Response. Digital Government: Research and Practice, 2 (1), 6:1-6:12.

Redmiles, E. M., Bodford, J., and Blackwell, L. (2019). “I Just Want to Feel Safe”: A Diary Study of Safety Perceptions on Social Media. Proceedings of the International AAAI Conference on Web and Social Media, 13, 405–416.

Simko, L., Chang, J. L., Jiang, M., Calo, R., Roesner, F., and Kohno, T. (2020). COVID-19 Contact Tracing and Privacy: A Longitudinal Study of Public Opinion. arXiv:2012.01553 [cs] [online]. Available from: http://arxiv.org/abs/2012.01553 [Accessed 17 Mar 2021].

Troncoso, C., Payer, M., Hubaux, J.-P., Salathé, M., Larus, J., Bugnion, E., Lueks, W., Stadler, T., Pyrgelis, A., Antonioli, D., Barman, L., Chatel, S., Paterson, K., Čapkun, S., Basin, D., Beutel, J., Jackson, D., Roeschlin, M., Leu, P., Preneel, B., Smart, N., Abidin, A., Gürses, S., Veale, M., Cremers, C., Backes, M., Tippenhauer, N. O., Binns, R., Cattuto, C., Barrat, A., Fiore, D., Barbosa, M., Oliveira, R., and Pereira, J. (2020). Decentralized Privacy-Preserving Proximity Tracing. arXiv:2005.12273 [cs] [online]. Available from: http://arxiv.org/abs/2005.12273 [Accessed 17 Mar 2021].

Vitak, J., Liao, Y., Kumar, P., Zimmer, M., and Kritikos, K. (2018). Privacy attitudes and data valuation among fitness tracker users. In: International Conference on Information. Springer, 229–239.