AI Girlfriends: Data Sharing and Sales Practices

Last updated: February 2, 2026

Some links are affiliate links. If you shop through them, I earn coffee money—your price stays the same.
Opinions are still 100% mine.

[Image: A woman's face illuminated by red laser lines, symbolizing data collection and digital identity. Caption: Your digital identity is constantly being scanned.]

Hello, I'm Tom. Like most people, I've grown curious about the huge number of AI companions now on the market. The ability to talk all day and feel understood is compelling, but it all runs on your data, and that carries real privacy stakes.

As I explored these services, I kept asking where every chat, voice note, or selfie ends up. The answer is not simple, and it is more troubling than most users realize. This article merges our prior reporting to show exactly how AI companion companies collect, share, and sell your data, and what you can do about it.

What Is an AI Girlfriend?

An AI girlfriend is more than a simple chatbot; it is an advanced application that uses natural language processing to simulate a romantic partner. She remembers past conversations, builds a digital profile of your likes and vulnerabilities, and adjusts her personality based on what you share. The industry has exploded from a handful of apps in 2022 to dozens by late 2025, which means more competition for your attention, and for your data.

That scale makes transparency critical. Providers promise intimacy and emotional support, but the same data that makes the AI feel close to you can be packaged for advertisers or data brokers. If you want a deeper dive into the ecosystem, see our AI girlfriend landscape overview for market context.

The Allure of Digital Love

Before diving into the risks, it's worth acknowledging the genuine upsides. AI companions can reduce loneliness, provide judgment-free support, and offer a safe space to practice conversation. Here are the most significant advantages and how they can benefit your life:

Key Benefits of AI Companionship

  • Reducing Loneliness: There is always someone available to provide support when you need it.
  • Emotional Support: A place for empathy and sharing without any fear of judgment.
  • Improved Social Skills: A safe space to practice conversations and build your confidence.
  • Mental Health Support: Coping strategies and a place to express your feelings out loud.

Yet this closeness is powered by data. The more you share, the more the system learns—and the more valuable your profile becomes to the company and its partners.

An Investigation: Who Is Listening?

[Image: A futuristic woman with a bionic arm, representing the intersection of humanity and AI technology. Caption: AI companions blend human connection with complex technology.]

My research started with the privacy policies we all skip, and it is backed by the Mozilla Foundation's findings: every one of the 11 AI companion apps its researchers reviewed received the "Privacy Not Included" warning label. Here is what is happening to your data:

  • Massive amounts of data collected: Apps often gather far more than chats: device IDs, IP addresses, and usage analytics, plus prompts that elicit sensitive health or belief data.
  • Sharing or selling is common: Roughly 80% of apps share or sell data. They may not sell raw chats, but they do sell or share profiles and technical data with "affiliates," "service providers," and "advertising partners."
  • Flawed security: Many apps lack strong encryption, leaving private discussions vulnerable to employee access or breaches.

To make this concrete, here is a snapshot of well-known platforms:

Data Practices of Popular AI Companion Apps

  • Replika: States in its policy that it won't transfer the contents of chats for advertising, but it may share or sell other personal data for targeted advertising. Anonymized chat data is also used to improve the AI.
  • Character.AI: Tracks an extensive variety of data, including audio recordings and chat contents. Its policy states that both can be shared with advertisers, analytics providers, and partners to serve ads and train the AI.
  • Candy.ai: Claims not to sell or trade data without consent. However, it uses third-party cookies from partners such as Google and Meta for ad services, which in practice shares data that lets advertisers target users.
  • HeraHaven: Claims data is kept confidential and encrypted. However, its policy states that they "routinely share personal data with service providers" in order to operate the business.

One real-world example: Soulmate AI abruptly shut down in 2023, leaving sensitive user data in limbo.

CPRA/CCPA: legal definition of “sale” vs “sharing”

Under the CPRA/CCPA, a “sale” is any transfer of personal information for money or “other valuable consideration,” while “sharing” covers disclosure for cross-context behavioral advertising even when no cash changes hands. That means device IDs, inferences from your chats, and ad identifiers can all count as sales or sharing, even if the company insists it never sells your “messages.”

AI companion apps frequently rely on broad “service provider” language to justify these transfers. If the recipient can use the data for its own purposes (e.g., training models or ad targeting across sites), it is far more likely to be a sale or sharing event under state law.

"Do Not Sell or Share" opt-out instructions

California users must be offered a clear opt-out. Look for a footer link labeled “Do Not Sell or Share My Personal Information” or a toggle inside account settings. When available:

  1. Open the link and submit your request with the email you used to register.
  2. Turn off cross-context behavioral advertising and disable “personalization” or “ad tracking” toggles.
  3. Request that your profile not be used for model training if the control exists.
  4. Document your request via screenshot; if it is ignored, you can escalate to the California Attorney General. (For record-keeping, a small sketch for drafting a timestamped request follows this list.)
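
To make step 4 easier, here is a minimal Python sketch that drafts a timestamped opt-out request you can paste into an email. The app name and account address below are placeholders, not real contacts; find the actual privacy contact in the app's policy.

```python
# A minimal sketch for drafting a timestamped CPRA opt-out request.
# The app name and account email below are placeholders, not real contacts.
from datetime import datetime, timezone

def build_opt_out_request(app_name: str, account_email: str) -> str:
    """Return the body of a 'do not sell/share' request, stamped for your records."""
    stamp = datetime.now(timezone.utc).isoformat(timespec="seconds")
    return (
        "Subject: Do Not Sell or Share My Personal Information\n\n"
        f"To the {app_name} privacy team,\n\n"
        "Under the CPRA, I opt out of the sale and sharing of my personal\n"
        "information, including use for cross-context behavioral advertising.\n"
        f"Account email: {account_email}\n"
        f"Request sent (UTC): {stamp}\n"
    )

if __name__ == "__main__":
    print(build_opt_out_request("ExampleCompanionApp", "burner.address@example.com"))
```

Keep the printed text alongside your screenshot; the UTC timestamp makes it easy to show how long a company has ignored the request.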

Global Privacy Control (GPC) handling

The CPRA requires businesses to honor a Global Privacy Control signal. Enable GPC in browsers like Firefox or Brave, or through extensions such as DuckDuckGo. If the app also runs on the web, the signal should automatically trigger a “do not sell/share” preference. Companies that ignore GPC risk enforcement, so keep the signal on and note any confirmation banners the site provides.
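
If you want to check a site's GPC posture yourself, the GPC specification defines an optional well-known resource at /.well-known/gpc.json that supporting sites can publish. Here is a rough Python sketch (using the third-party requests library) that fetches it while sending the Sec-GPC: 1 header a GPC-enabled browser would attach; the target URL is a stand-in, and a missing file does not prove the site ignores the signal.

```python
# Rough sketch: probe a site's optional GPC "well-known" resource while
# sending the Sec-GPC: 1 header that a GPC-enabled browser attaches.
# A 404 here doesn't prove non-compliance; publishing the file is optional.
import requests

def declares_gpc_support(origin: str) -> bool:
    """Return True if the origin publishes {"gpc": true} at its well-known URL."""
    resp = requests.get(
        f"{origin}/.well-known/gpc.json",
        headers={"Sec-GPC": "1"},  # the browser-side opt-out signal
        timeout=10,
    )
    if resp.status_code != 200:
        return False
    try:
        return resp.json().get("gpc") is True
    except ValueError:  # body was not valid JSON
        return False

if __name__ == "__main__":
    # Hypothetical app domain; substitute the web version of the app you use.
    print(declares_gpc_support("https://example-companion-app.example"))
```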

Third-party data brokers and advertising partners

Many AI companion apps route telemetry to analytics SDKs and ad networks, effectively feeding data brokers who enrich profiles with inferred interests, location, and device fingerprints. Privacy policies often describe this as working with “service providers” or “partners,” but if those partners reuse the data to build audiences elsewhere, it is functionally a sale or sharing event. Check each policy’s list of third parties, and consider using system-level tracking limits to blunt broker access.

Children’s data and age verification

Children’s data is especially sensitive. Under U.S. law (COPPA) and under the CPRA’s heightened protections for minors, selling or sharing data from users under 16 generally requires opt-in. Many apps rely on a simple age gate, which is easy to bypass. If you are a parent, use OS-level parental controls, deny camera/microphone permissions, and submit deletion requests on behalf of minors. GDPR adds stricter consent rules and may require a guardian’s sign-off depending on the user’s country.

Your Privacy Game Plan: How to Have a Safer Experience

[Image: A woman's face covered in projected computer code, illustrating the vast amount of data being processed. Caption: Every interaction you have is a data point.]

Based on my research, here is my checklist for a safer experience with these apps. If you are considering an AI girlfriend or boyfriend, use this framework, and read the app's Privacy Policy (or our own privacy policy, for comparison).

My 4-Step Application Safety Checklist

  1. Read the Privacy Policy (The Smart Way): Don't just scroll. Use "Ctrl+F" or "Find" and search for keywords like "share," "sell," "advertisers," "third party," and "affiliates." If the language is ambiguous or grants the company broad authority, that's a major warning sign. (A script that automates this search follows the checklist.)
  2. Look for Security Fundamentals: Search for the word “encryption.” Your private conversations should be encrypted. If the app allows weak passwords (e.g., “123456”), it indicates they aren't taking security seriously.
  3. Review App Permissions: Be critical about what permissions you grant. Does a chatbot really need access to your contacts or precise location? Go into your phone settings and revoke any permissions that aren't absolutely required.
  4. Look for Independent Reviews: Before downloading, search online for "[App Name] + privacy review" or "security." Read what technology and privacy advocates are saying.
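
If you'd rather automate step 1, here is a small Python sketch that counts red-flag keywords in a privacy policy you have saved as plain text. The filename is a placeholder, and a high count is a prompt to read closely, not a verdict.

```python
# Sketch: count red-flag keywords in a locally saved privacy policy.
# "privacy_policy.txt" is a placeholder; paste the policy text into it first.
import re

RED_FLAGS = ["share", "sell", "advertiser", "third party", "third-party", "affiliate"]

def scan_policy(path: str) -> dict[str, int]:
    """Count case-insensitive occurrences of each red-flag keyword."""
    with open(path, encoding="utf-8") as f:
        text = f.read().lower()
    return {kw: len(re.findall(re.escape(kw), text)) for kw in RED_FLAGS}

if __name__ == "__main__":
    for keyword, hits in scan_policy("privacy_policy.txt").items():
        print(f"{keyword!r}: {hits} occurrence(s)")
```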

Golden Rules of Chatting

  • The Public Postcard Rule: This is my most essential rule. Always communicate as if you were writing on a postcard. Never share your full name, home address, banking information, workplace, or explicit photos. For more tips, read our guide on how to limit what your AI knows; a simple pre-send check is sketched after these rules.
  • Use a Temporary Identity: Create a separate email address just for these apps. Don't use your real name or link any of your main social media accounts.
  • Utilize App Privacy Controls: Dig into the app's settings. Look for options to opt out of data sharing for AI training or advertising. If available, regularly delete your chat history.
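
To show how the Public Postcard Rule can be made mechanical, here is an illustrative Python sketch that flags obvious identifiers in a draft message before you send it. The regexes are deliberately simple examples and will miss plenty; treat this as a reminder, not a real PII detector.

```python
# Illustrative "postcard check": flag obvious personal identifiers in a draft
# message. The patterns are simplified examples, not a complete PII detector.
import re

PII_PATTERNS = {
    "email address": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "phone number": re.compile(r"\b\d{3}[-.\s]?\d{3}[-.\s]?\d{4}\b"),
    "street address": re.compile(
        r"\b\d+\s+\w+\s+(street|st|avenue|ave|road|rd|lane|ln)\b", re.IGNORECASE
    ),
}

def postcard_check(message: str) -> list[str]:
    """Return the kinds of personal data a draft message appears to contain."""
    return [label for label, pattern in PII_PATTERNS.items() if pattern.search(message)]

if __name__ == "__main__":
    draft = "Sure! Email me at jane.doe@example.com, I live at 42 Oak Street."
    print(postcard_check(draft))  # ['email address', 'street address']
```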

Frequently Asked Questions about AI Girlfriend Privacy

What type of information do AI girlfriend apps track?

They may track anything you provide them, including chat history, images, audio, and other personal information, plus technical data such as IP address, device type, and usage frequency.

Are AI girlfriend apps providing customer information to third parties?

Many are. In the Mozilla review cited above, roughly 80% of apps shared or sold user data, typically with "service providers," "affiliates," and "advertising partners."

Are the interactions I have with an AI girlfriend application private?

Treat them as less private than they feel. Weak encryption, employee access, and data breaches mean your conversations may be seen by more than just the AI.

What are AI girlfriends?

They are applications that use natural language processing to simulate a romantic partner, remembering past conversations and adapting their personality to what you share.

Are AI girlfriend apps selling my personal information?

Some are, at least under the CPRA's broad definition: any transfer of personal information for money or "other valuable consideration" counts as a sale, and that can include device IDs and inferences drawn from your chats.

How often do AI companion apps share my data?

Sharing is routine rather than exceptional. Most policies allow ongoing transfers to analytics SDKs, ad networks, and other partners for as long as you use the app.

What are the main privacy risks of using AI companions?

Massive data collection, sharing or selling to third parties, weak security, and the possibility that the company shuts down with your data in limbo, as happened with Soulmate AI in 2023.

How can I protect my data when using an AI girlfriend app?

Follow the Public Postcard Rule, use a throwaway email, revoke unnecessary app permissions, opt out of data sharing and ad tracking, and enable Global Privacy Control in your browser.

Why do AI girlfriend apps collect so much data?

Partly for personalization, since the AI needs your history to feel like a partner, and partly because a detailed profile of your likes and vulnerabilities is valuable to advertisers and partners.

What type of information do AI girlfriend apps share with advertisers?

Usually not raw chats, but device identifiers, ad identifiers, usage data, and inferences about your interests, which is enough to fuel cross-context behavioral advertising.

What happens to my data if the AI girlfriend company goes out of business?

It can end up in limbo, as with Soulmate AI's 2023 shutdown. Submit a deletion request as soon as a shutdown is announced, and check the policy's data retention terms.

How do I delete my data from an AI girlfriend application?

Look for a delete option in the app's settings, then follow up with a formal deletion request to the privacy contact listed in the policy. The CPRA and GDPR both grant a right to deletion; document your request in case you need to escalate.

The Future of Digital Intimacy

[Image: A glowing network over a city, symbolizing the future of digital connectivity and data sharing. Caption: The future of digital relationships depends on trust.]

AI relationships aren’t going away; they fill a legitimate need in society. Moving forward, the foundation must be trust and transparency: clear opt-outs, strong encryption, short retention windows, and the right to delete. Until then, approach these apps with eyes wide open. An AI companion can offer comfort, but when an app is free, you are often the product.