Consent

Being intentional about obtaining consent before sharing data with AI builds user trust, provides liability protection, and supports ethical user experiences. AI-powered transcription is not a new invention of generative AI. However, now that models can be trained on our voices and content can be generated from our conversations, people are justifiably more concerned about the privacy of these tools.

Consent is necessary in three domains:

  • Personal data, such as whether conversations can be recorded or analyzed, and especially when the data may be used to train the model.
  • Organizational data, to avoid sharing proprietary information with third-party AI.
  • Other people's data, such as recording or training on voices, likenesses, or text that does not belong to the primary user.

Consent in the first two domains is largely covered by the data ownership pattern. Getting consent from others to use AI to record, and possibly train on, their data requires its own, more intentional approach.

Variations and forms

Very few products on the market proactively request consent and block recording or transcription when it is not given. Instead, this pattern takes multiple, potentially overlapping forms (a minimal sketch of these modes follows the list):

  • Opt-in disclosure: Users actively agree before the product records or uses data. Example: Limitless requiring audible consent from others before recording.
  • Silent by default: Products record or capture data without notifying others. Example: Granola captures a meeting transcript without storing audio recordings. The absence of stored voice recordings is used to justify transcribing without notifying others by default.
  • Post-hoc alerts: The product records, then alerts participants after the fact. Less common, but seen in transcription tools that notify attendees once transcription has started.
  • Consent for training: Explicit requests to use voices, likenesses, or user content in model fine-tuning. ElevenLabs sought permission from celebrity estates to avoid misuse.
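
To make these forms concrete, here is a minimal sketch, in TypeScript, of how a product might represent them internally. Everything here is illustrative; the names and shapes are assumptions, not taken from any of the products mentioned.

```typescript
// Illustrative only: names and types are assumptions, not any vendor's API.
type CaptureMode = "opt-in" | "silent-by-default" | "post-hoc-alert";

interface CapturePolicy {
  mode: CaptureMode;
  notifyBeforeCapture: boolean; // told in advance, or only after the fact?
  trainingConsent: boolean;     // a separate, explicit opt-in for model training
}

// Gate recording on the mode the product has adopted.
function mayStartCapture(policy: CapturePolicy, everyoneAgreed: boolean): boolean {
  switch (policy.mode) {
    case "opt-in":
      // Record only once all participants have actively agreed.
      return everyoneAgreed;
    case "post-hoc-alert":
      // Capture begins, but participants must still be alerted afterward.
      return true;
    case "silent-by-default":
      // Legal in some jurisdictions, but the weakest form of consent.
      return true;
  }
}
```

Note that training consent is modeled as its own field: none of the capture modes implies permission to train on the data.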

Consent to develop someone’s likeness

It is increasingly easy to create fake voices and avatars of people from recordings of their voice and appearance. ElevenLabs announced that it requested consent from the estates of celebrities like Burt Reynolds to clone their voices and feature them in the product.

ElevenLabs requested consent from the estates of celebrities like Burt Reynolds to clone their voices

Just as there are laws that keep our images from being used to suggest we endorse something without our permission, similar laws and patterns will be needed for voice cloning.

Consent and hardware devices

Meta chose to sell its AI glasses with only a small LED to indicate active recording, an indicator that bystanders can easily miss or fail to recognize. There have already been public cases where the glasses were used to harass others by recording them and posting the recorded exchanges online.

Returning to the Limitless device: the approach the company initially took offered a novel pattern, recording seamlessly while protecting the privacy of others. The technology exists to support an ethical approach to designing these experiences, but it is up to the company and the designer to use it.

Design considerations

  • Make consent explicit and visible. Consent should never rely on buried text or ambiguous icons. Use clear, persistent indicators when recording, analyzing, or sharing data, and ensure it is visible from the surface where the interaction is occurring.
  • Default to opt-in, not opt-out. Silence is not consent. Ask for permission before capturing audio, video, or text interactions, even in products with prior authorization. Opt-in flows not only respect user autonomy but also simplify compliance with emerging AI and privacy regulations.
  • Differentiate consent types. Recording consent, model training consent, and sharing consent are distinct decisions. Users should be able to grant or withhold each independently (see the sketch after this list). OpenAI and Anthropic both separate data-sharing choices from product usage, a good standard to follow.
  • Respect withdrawal at any point. Consent must be reversible. Provide an easy way to stop data collection or delete previously captured data, and clearly communicate what will and won’t be removed. Users gain trust when they see how revocation takes effect in real time.
  • Use multimodal indicators when screens aren’t visible. In voice, AR, or wearable contexts, consent cues must persist through light, sound, or vibration. Google’s Nest and Meta’s Ray-Ban smart glasses use LEDs and audio pings to confirm recording. While not explicitly required in every jurisdiction, these affordances build consumer trust and brand value.
  • Design for shared environments. In meetings or group chats, all participants must be notified, not only the initiator. Zoom’s AI Companion automatically announces when AI features are active and requests acknowledgment from everyone before continuing.
  • Clarify downstream use. When user data might be used to train or improve models, disclose where it flows and how it’s anonymized or aggregated. Separate this choice from feature functionality, so declining model training doesn’t disable core product use.
  • Balance organizational and individual control. For enterprise deployments, admins may predefine consent policies. Still, each user should receive clear, personal confirmation when AI recording or summarization is active. Organizational defaults can’t replace individual awareness.
  • Design for dignity, not just compliance. Beyond legal formality, consent conveys respect. When users see that your product asks clearly, listens, and responds to their decisions, it reinforces trust that goes deeper than any checkbox.
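
As a sketch of the "differentiate" and "withdrawal" considerations above, here is one way to model independent, revocable consent types. This is a hypothetical consent store, assuming in-memory storage for brevity; it is not any particular product's API.

```typescript
// Hypothetical consent store: each consent type is granted and revoked independently.
type ConsentType = "recording" | "training" | "sharing";

interface ConsentRecord {
  type: ConsentType;
  granted: boolean;
  grantedAt?: Date;
  revokedAt?: Date;
}

class ConsentStore {
  private records = new Map<ConsentType, ConsentRecord>();

  // Granting one consent type never implies another.
  grant(type: ConsentType): void {
    this.records.set(type, { type, granted: true, grantedAt: new Date() });
  }

  // Revocation must be possible at any time and take effect immediately.
  revoke(type: ConsentType): void {
    const existing = this.records.get(type);
    if (existing?.granted) {
      this.records.set(type, { ...existing, granted: false, revokedAt: new Date() });
      // In a real product, trigger deletion of previously captured data here,
      // and tell the user exactly what will and won't be removed.
    }
  }

  has(type: ConsentType): boolean {
    return this.records.get(type)?.granted ?? false;
  }
}

// Declining model training must never disable core product use:
const consents = new ConsentStore();
consents.grant("recording");            // user allows recording...
console.log(consents.has("training"));  // ...but training remains false
```

The key design choice is that `has("training")` defaults to false: silence is not consent, and the opt-in default described above falls out of the data model rather than being enforced ad hoc.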

Examples

Gong sends a notification to participants ahead of time with the option to join but request that the meeting not be recorded, implying consent when people join through the main link.
The Limitless Pendant used to offer the option to record only the voices of people who had audibly granted consent. This feature is no longer on by default.
Notion shows a scrolling marquee when it starts transcribing notes, alerting the user to the requirement to explicitly request consent.