If you ever stop to think about what your browsers and web platforms know about you, you might get spooked. We leave footprints of our preferences and interests everywhere we go.

Things get more complicated when we are interacting with AI, and this is magnified as AI agents begin to act like our digital twins, connecting to our personal information and products in order to act on our behalf.

The more we interact with AI, the better it will get to know us. Users benefit from this arrangement in the form of more personalized AIs that anticipate their needs, and more confidence that the AI can perform tasks as they intend. On the other hand, this greater degree of intimacy comes with an expectation of control. Just as users can clear their browser cache or adjust their ad preference settings, users will need some way of knowing what the AI knows about them, and possibly some way to reset that relationship altogether.

Knowledge map

Unless a user is interacting with the AI in a private session, it is constantly storing data about the user, their activity, and their preferences. Transparent products expose this map to the user to help them understand what traces of data they have left behind.
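One way to picture this is as a simple per-user memory store that the product can surface as a knowledge map. This is a minimal, hypothetical sketch; the names (`MemoryEntry`, `MemoryStore`, `knowledge_map`) are illustrative, not any particular product's API.

```python
from dataclasses import dataclass, field

@dataclass
class MemoryEntry:
    key: str     # e.g. "dietary_preference"
    value: str   # e.g. "vegetarian"
    source: str  # which interaction produced this memory

@dataclass
class MemoryStore:
    entries: list[MemoryEntry] = field(default_factory=list)

    def remember(self, key: str, value: str, source: str) -> None:
        self.entries.append(MemoryEntry(key, value, source))

    def knowledge_map(self) -> dict[str, str]:
        """Everything the AI currently knows about the user."""
        return {e.key: e.value for e in self.entries}

store = MemoryStore()
store.remember("dietary_preference", "vegetarian", "chat on 2024-01-12")
print(store.knowledge_map())
```

Recording the `source` alongside each entry lets the UI answer not just "what does the AI know?" but "where did it learn that?", which makes the map far easier for users to audit.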

Selective memory

When a user can see what data the AI has captured, they may wish to selectively edit what data it retains. This might be for privacy reasons, or to focus the AI's knowledge in order to get more predictable outcomes. A product may also give the user the ability to save specific details to memory in order to shape the knowledge map to meet their preferences.
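Selective memory amounts to two user-facing operations on the store: deleting a single memory, and explicitly pinning a detail the user wants kept. A hypothetical sketch (all names illustrative):

```python
class SelectiveMemory:
    def __init__(self):
        self._facts: dict[str, str] = {}
        self._pinned: set[str] = set()

    def remember(self, key: str, value: str) -> None:
        """AI-captured memory from normal interaction."""
        self._facts[key] = value

    def pin(self, key: str, value: str) -> None:
        """User explicitly saves a detail to memory."""
        self._facts[key] = value
        self._pinned.add(key)

    def forget(self, key: str) -> None:
        """User removes a single memory without touching the rest."""
        self._facts.pop(key, None)
        self._pinned.discard(key)

memory = SelectiveMemory()
memory.remember("employer", "Acme Corp")
memory.pin("tone", "concise")
memory.forget("employer")  # only this one memory is removed
```

Distinguishing pinned entries from inferred ones also lets the UI mark which memories the user asked for versus which the AI picked up on its own.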

Clear the cache

While it's hopefully a rare occurrence, a user may occasionally wish to start from a clean slate. Relying on them to delete and restart their account is likely to result in attrition. Instead, a product can support a reset option that fully clears the AI's memory of the user and starts anew.
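The reset itself can be a single operation that wipes every stored memory without touching the account. A hypothetical sketch, with illustrative names; returning a count gives the confirmation UI something concrete to show the user:

```python
class UserMemory:
    def __init__(self):
        # Memories accumulated over past interactions
        self.facts = {"name": "Sam", "interest": "cycling"}

    def reset(self) -> int:
        """Clear the AI's entire memory of this user, keep the account."""
        count = len(self.facts)
        self.facts.clear()
        return count  # e.g. "Cleared 2 stored memories"

m = UserMemory()
removed = m.reset()
```

Because the account survives the reset, the user keeps their subscription and settings while the AI relationship genuinely starts over.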

Details and variations

  • If the AI stores personal details about an individual or a company, allow them to see what it knows
  • Allow users to edit or clear the AI's memory
  • Consider that users may want or require different personas that they show the AI, and therefore may require the ability to segment their memories
  • Give users a way to browse incognito so memories are not stored
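The last two variations, segmented personas and incognito browsing, can share one mechanism: memories are written into named segments, and an incognito flag short-circuits storage entirely. A minimal sketch, assuming illustrative names throughout:

```python
class SegmentedMemory:
    def __init__(self):
        self.segments: dict[str, dict[str, str]] = {}  # persona -> facts
        self.incognito = False

    def remember(self, persona: str, key: str, value: str) -> None:
        if self.incognito:
            return  # incognito sessions persist nothing
        self.segments.setdefault(persona, {})[key] = value

mem = SegmentedMemory()
mem.remember("work", "role", "technical editor")
mem.incognito = True
mem.remember("personal", "hobby", "chess")  # silently dropped
```

The key design point is that incognito is enforced at the storage layer, not the UI layer, so no code path can accidentally persist data during a private session.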

Considerations

Positives

Build trust

Consumer trust in AI is low, and no matter how great your product is, that distrust will likely drive down engagement. Stand out by offering this type of user control, and you are likely to gain trust faster.

Creative constraints

Giving users control over their memories can open up new creative opportunities, like letting users define different memory sets for different purposes.

Concerns

Paid privacy

Avoid limiting the ability to see and control the AI's memory to paid accounts. All users deserve this control.

Fake privacy

Google recently settled a class action lawsuit brought by people who browsed in Incognito mode without realizing some of their data was still being captured. When users are browsing in incognito mode, they expect full privacy.

Use when:
Users want to see the personal data the AI has stored about them and manage it to meet their preferences.