Morgan Klaus Scheuerman, Information Science PhD Student, University of Colorado Boulder

Facial Confirmation System: Enabling Users to Opt-In and Customize Information about Themselves in Facial Classification Technology

Imagining what user consent and autonomy look like in commercial facial analysis applications that use facial detection and classification for personalized recommendation.



Facial analysis technologies, such as facial detection and facial recognition, have become a central domain of concern in the realms of privacy and technology bias. One specific concern is collecting sensitive, identifiable data about human beings (such as race, age, and gender), including at-risk and marginalized groups, and how that information might be used. Meanwhile, these types of technologies are becoming increasingly prevalent in the commercial realm, particularly in the context of personalized or targeted advertising and recommender systems. Users often have little control, or even knowledge, over when and how this data is collected, which data is collected, and how it is being used.


This design imagines an alternative to automated facial analysis technology that privileges user autonomy over algorithmic autonomy by allowing users to input and edit their own data. Facial confirmation technology reimagines the way users interact with their own intimate facial data for the sake of commercial personalized recommendations. Instead of having their data passively mined without consent, users can opt into data collection and essentially label their own data over time. Through a mobile or desktop application, users can create their own self-identified labels that act as a folksonomic schema—a user-generated label tree. Users can also revoke the system's permission to store their facial data at any time.
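The opt-in, user-generated label tree and revocation mechanism described above could be sketched as a simple data model. This is a minimal illustration, not an implementation of any existing system; all class and method names (`Label`, `UserProfile`, `opt_in`, `revoke`) are hypothetical.

```python
from dataclasses import dataclass, field

@dataclass
class Label:
    """A self-identified label; nested children make the schema a folksonomic tree."""
    name: str
    children: list["Label"] = field(default_factory=list)

    def add(self, child_name: str) -> "Label":
        """Create and attach a child label, returning it for further nesting."""
        child = Label(child_name)
        self.children.append(child)
        return child

@dataclass
class UserProfile:
    """A user's consent state, label tree, and stored image references."""
    user_id: str
    consented: bool = False  # opt-in: data collection is off by default
    labels: Label = field(default_factory=lambda: Label("me"))
    image_ids: set[str] = field(default_factory=set)

    def opt_in(self) -> None:
        self.consented = True

    def revoke(self) -> None:
        """Revoke consent: the system must also drop all stored images."""
        self.consented = False
        self.image_ids.clear()

# Usage: a user opts in, grows their own label tree, then revokes.
profile = UserProfile("ade")
profile.opt_in()
identity = profile.labels.add("identity")
identity.add("nonbinary")
identity.add("person of color")
profile.image_ids.add("img-001")
profile.revoke()
print(profile.consented, len(profile.image_ids))  # → False 0
```

The key design choice here is that revocation is a single operation that both flips the consent flag and deletes the stored images, so the two can never drift apart.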

Who is this for?

  • People who…

    • Are concerned about their privacy in commercial settings, like advertising and personalized recommender systems

    • Want control over self-identifying their own characteristics

    • Want to fine-tune their recommendations beyond generic algorithmic predictions

Use case / Scenario:

Ade is a queer person of color. They have read a lot of concerning coverage of facial analysis technologies in academia and the news, particularly the conversations around race and gender biases. They know this technology isn’t yet a part of their everyday life, but that it is becoming increasingly prevalent.

One day, Ade does encounter this technology. They are out shopping and come across an advertising billboard in their favorite store. On the billboard is a statement that the ads are powered by personalized facial detection, though what exactly that means remains a black box from the outside. All Ade notices is that it gets them wrong. The advertisement they see is stereotypically feminine, and the skincare product it is advertising shows only white women using the product. They leave the store feeling frustrated that a machine saw them as a woman, while also erasing their identity as a person of color. They wonder what it would look like for these seemingly futuristic technologies to be more inclusive and forward-thinking.

The Facial Confirmation System (FCS) seeks to address concerns about demographic accuracy and data ownership. The goal of the FCS is to be customizable and scalable to numerous applications, like the advertising system Ade encountered. The system must first get user consent before automatically detecting, classifying, and storing images of passersby and their associated labels. Ade has the ability to manage their images using an app. There, they can label their own images and customize their interests. Ade may also remove any images of themselves from the database at any time. User-inputted labels are decoupled from their images and create a web of more inclusive, comprehensive labels for training the system as a whole.
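The consent-gated capture flow and label decoupling described above could be sketched as follows. This is a rough illustration under assumed in-memory storage; the function names (`capture_image`, `remove_images`, `export_training_labels`) and data shapes are hypothetical, not part of any deployed FCS.

```python
def capture_image(db: dict, user_id: str, image_id: str, consented: set) -> bool:
    """Store an image only if the user has opted in; return whether it was stored."""
    if user_id not in consented:
        return False  # no consent: the image is never stored
    db.setdefault(user_id, set()).add(image_id)
    return True

def remove_images(db: dict, user_id: str) -> None:
    """User-initiated removal of all of their images from the database."""
    db.pop(user_id, None)

def export_training_labels(labels_by_user: dict) -> set:
    """Decouple labels from images: only the pooled label vocabulary is used
    for training, never linked back to a specific face or user."""
    pooled = set()
    for labels in labels_by_user.values():
        pooled.update(labels)
    return pooled

# Usage: only opted-in users are ever stored, and labels are pooled anonymously.
db: dict = {}
consented = {"ade"}
stored = capture_image(db, "ade", "img-1", consented)      # True: Ade opted in
ignored = capture_image(db, "sam", "img-2", consented)     # False: no consent
remove_images(db, "ade")                                   # Ade deletes their images
vocab = export_training_labels({"ade": {"nonbinary", "person of color"}})
```

Pooling labels through a separate export step, rather than training directly on user records, is what keeps the self-identified vocabulary useful to the system without tying any label back to an individual face.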

Related links: