Proceedings of the 1st International Conference on AI for People: Towards Sustainable AI, CAIP 2021, 20-24 November 2021, Bologna, Italy

Research Article

Informed Digital Consent for Use of AI Systems Grounded in a Model of Sexual Consent

  • @INPROCEEDINGS{10.4108/eai.20-11-2021.2314136,
        author={Emmie Hine},
        title={Informed Digital Consent for Use of AI Systems Grounded in a Model of Sexual Consent},
        proceedings={Proceedings of the 1st International Conference on AI for People: Towards Sustainable AI, CAIP 2021, 20-24 November 2021, Bologna, Italy},
        publisher={EAI},
        proceedings_a={CAIP},
        year={2021},
        month={12},
        keywords={consent, artificial intelligence, tolerant paternalism, infosphere, autonomy},
        doi={10.4108/eai.20-11-2021.2314136}
    }
    
Emmie Hine¹,*
  • 1: Oxford Internet Institute, University of Oxford, U.K.
*Contact email: emmie.e.hine@gmail.com

Abstract

Artificial intelligence (AI) systems shape our infospheres, mediating our interactions and defining what information we have access to. This poses a tremendous threat to individual autonomy and impacts society, both online and offline. Users are often unaware of the potential impacts of using these systems, and companies that utilise them are not incentivised to adequately inform their users of those impacts. Forms of digital design ethics, including pro-ethical design and tolerant paternalism, have been proposed to help protect user autonomy, but are not sufficient to ensure that users are educated enough to make informed decisions. In this paper, I use sexual consent as defined by American universities to outline and propose ways to implement a model of “informed digital consent” that would ensure that users are well-informed so that their autonomy is not only respected, but enhanced.