TL;DR: OpenAI launched ChatGPT Health in the US on January 7, 2025, but excluded the European Economic Area, Switzerland, and the UK. This exclusion signals the product does not meet GDPR and AVG data protection standards. Health data uploaded to ChatGPT Health loses HIPAA protection, creates aggregation risk, and operates under weaker US privacy rules. Expat entrepreneurs in the Netherlands should avoid uploading sensitive health data to platforms outside EU jurisdiction.
What You Need to Know About ChatGPT Health and Data Protection
- ChatGPT Health does not launch in the Netherlands or EU because the product does not meet GDPR and AVG data protection standards for health data.
- Health records uploaded to ChatGPT Health lose HIPAA protection and fall under OpenAI’s terms of service, not federal health privacy law.
- Data aggregation on the platform creates high-value breach targets. Healthcare breaches in 2025 averaged 71,276 records per incident.
- Third-party apps connected to ChatGPT Health operate under their own privacy policies. Disconnecting apps does not delete data already collected.
- Expat entrepreneurs in the Netherlands should not upload sensitive health data to platforms that do not comply with GDPR.
Why Is ChatGPT Health Not Available in the Netherlands?
OpenAI announced ChatGPT Health on January 7, 2025. The product promises to bring your medical records and AI intelligence together in one place. Over 230 million people already ask ChatGPT health questions every week.
The product will not launch in the Netherlands. It will not launch anywhere in the European Economic Area, Switzerland, or the UK.
This is not an oversight. This is a signal.
I run a publication focused on governance, compliance, and digital risk for expat entrepreneurs in the Netherlands. When I see a major tech company avoid the EU while launching in the US, I pay attention. The gap between those two regulatory environments tells you what’s missing from the product.
In this case, what’s missing is structural data protection that meets European standards.
Bottom Line: OpenAI’s decision to exclude the EU market indicates ChatGPT Health does not meet GDPR and AVG requirements for processing health data.
How Does ChatGPT Health Work?
ChatGPT Health allows you to upload your medical records. You connect the platform to third-party apps like Apple Health or other wellness tools. You ask health questions. The AI responds using your data.
OpenAI partners with a company called b.well to access your medical records. The data flows through what OpenAI calls “purpose-built encryption and isolation.” The company says your health conversations stay compartmentalized and will not train their foundation models.
The Data Flow Structure
You upload records → b.well retrieves them → OpenAI processes them → Third-party apps access them if you opt in
Each step introduces a new entity that handles your data. Each entity operates under its own privacy policy.
When you connect a third-party app, that data is governed by the third party’s rules, not OpenAI’s.
The complexity multiplies. The control fragments.
Key Insight: Multiple entities in the data processing chain mean multiple privacy policies and fragmented control over your health information.
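As a sketch, the chain above can be modeled as a list of processors, each carrying its own policy. The entity and policy names here are illustrative placeholders, not actual legal documents:

```python
from dataclasses import dataclass

@dataclass
class Processor:
    """One entity in the health-data processing chain (names illustrative)."""
    name: str
    policy: str  # each entity applies its OWN privacy policy

# Hypothetical chain mirroring the flow above:
# upload -> b.well retrieval -> OpenAI processing -> opted-in third-party app
chain = [
    Processor("b.well", "b.well privacy policy"),
    Processor("OpenAI", "OpenAI terms of service"),
    Processor("WellnessApp", "third-party app policy"),
]

def governing_policies(chain: list[Processor]) -> dict[str, str]:
    """Which policy governs the record at each hop: one record, three rule sets."""
    return {p.name: p.policy for p in chain}

print(governing_policies(chain))
```

One upload, three distinct rule sets, and the user negotiates none of them individually once the record enters the chain.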
Why Do GDPR and AVG Matter for ChatGPT Health?
In the Netherlands, health data falls under the strictest category of personal data protection. The Algemene Verordening Gegevensbescherming (AVG), the Dutch name for the GDPR, treats health information as “special category data.”
What AVG Requires for Health Data
Processing health data requires:
- Explicit consent
- Clear purpose limitation
- Demonstrable data minimization
- Proof of where data goes, who processes it, and how long it’s retained
When multiple parties process your data, each one becomes a data controller or processor under the AVG. This creates a chain of accountability. If something breaks, the Autoriteit Persoonsgegevens investigates every link.
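A minimal illustration of that accountability chain as a checklist, using hypothetical field names for a processing record. This is a sketch of the logic, not legal advice:

```python
# Illustrative checklist, not legal advice: the AVG conditions listed above,
# expressed as a record check. Field names are assumptions for this sketch.
REQUIRED = {
    "explicit_consent",       # explicit consent for special category data
    "purpose",                # clear purpose limitation
    "data_minimized",         # demonstrable data minimization
    "retention_period",       # how long data is retained
    "processors_documented",  # where data goes and who processes it
}

def avg_gaps(processing_record: dict) -> set[str]:
    """Return the required conditions that are missing or undocumented."""
    return {k for k in REQUIRED if not processing_record.get(k)}

record = {"explicit_consent": True, "purpose": "care coordination"}
print(sorted(avg_gaps(record)))
# -> ['data_minimized', 'processors_documented', 'retention_period']
```

If any gap remains when the Autoriteit Persoonsgegevens asks, that link in the chain is where the investigation lands.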
How US Data Protection Differs
In the US, the picture is different.
When you upload your medical records to ChatGPT Health, those records lose HIPAA protection. Privacy experts note that “no federal regulatory body governs the health information provided to AI chatbots, and ChatGPT provides technology services that are not within the scope of HIPAA.”
If a breach happens, you have no specific rights under HIPAA. You operate under OpenAI’s terms of service, terms that can change at any time.
Franco Giandana Gigena, policy analyst at Access Now, told reporters that ChatGPT Health’s absence from the EEA, Switzerland, and the UK is “concerning regarding the level of minimization, purpose limitation and overall protection the system actually offers.”
Translation: the product does not meet European data protection standards.
Core Difference: EU health data protection provides structural accountability through GDPR and AVG. US platforms operate under voluntary terms of service without comprehensive federal privacy law.
What Are the Risks of Health Data Aggregation?
I’ve spent years helping small companies in the Netherlands build data governance structures. One principle remains constant: aggregation increases exposure.
When you centralize sensitive data in one platform, you create a high-value target.
Healthcare Breach Statistics
- Healthcare data breaches in 2025 averaged 71,276 records per incident
- The healthcare sector experienced 605 breaches affecting 44.3 million Americans
- Healthcare remains the costliest industry for data breaches for the 14th consecutive year
- Average breach costs hit €9.4 million per incident
Security experts warn that any system aggregating medical records, wellness data, and AI-generated health insights on a single platform significantly increases the data exposed in a breach.
What This Means for Expat Entrepreneurs
If you’re building a digital health product in the Netherlands: Data protection is a structural requirement, not a marketing feature. Competitors who ignore GDPR will not survive contact with the Autoriteit Persoonsgegevens.
If you’re managing your own health data: Uploading sensitive information to platforms outside EU jurisdiction removes your legal protections. Once data crosses the Atlantic, Dutch and EU data subject rights become theoretical.
Reality Check: Health data aggregation creates high-value breach targets. The Netherlands requires structural data protection because aggregation increases exposure.
What Should Founders in the Netherlands Know About ChatGPT Health?
I’m not writing this to create panic about AI in healthcare. AI tools will transform how we access health information. The question is whether this transformation happens with structural data protection or without it.
If you’re an expat entrepreneur in the Netherlands running a micro or small business, here’s what matters:
Jurisdiction Determines Your Protections
When you use a US-based platform, you operate under US data protection rules. Those rules are weaker than what the AVG provides.
If you upload client data, employee health information, or your own medical records to a platform that does not comply with GDPR, you create exposure.
Encryption Types Matter
ChatGPT Health mentions encryption “at rest and in transit.” This is standard.
End-to-end encryption means only you decrypt your data. The platform provider cannot access it.
OpenAI has not confirmed end-to-end encryption for ChatGPT Health. Without it, you should assume OpenAI can access your health data.
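The difference is key custody, and it can be sketched in a few lines. The cipher below is a deliberately trivial XOR, for illustration only and never for real use; the point is who stores the key:

```python
import secrets

# Toy sketch of key custody, NOT real cryptography. The cipher is a plain
# XOR used only to make the point: what matters is who holds the key.

def toy_encrypt(data: bytes, key: bytes) -> bytes:
    return bytes(b ^ k for b, k in zip(data, key))  # illustration only

toy_decrypt = toy_encrypt  # XOR is its own inverse

user_key = secrets.token_bytes(64)
record = b"blood pressure: 120/80"
ciphertext = toy_encrypt(record, user_key)

# "Encryption at rest" model: the provider stores the ciphertext AND the
# key, so the provider can decrypt your record whenever it chooses.
provider_store = {"blob": ciphertext, "key": user_key}
assert toy_decrypt(provider_store["blob"], provider_store["key"]) == record

# End-to-end model: the provider stores only the ciphertext. Without the
# user-held key, the record is unreadable to the provider.
e2e_store = {"blob": ciphertext}
assert toy_decrypt(e2e_store["blob"], user_key) == record  # only the user can
```

“At rest and in transit” describes the first model. End-to-end encryption describes the second, and only the second keeps the provider out.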
Consent Alone Does Not Protect You
Under the AVG, consent is one lawful basis for processing data. Even with consent, you must demonstrate purpose limitation, data minimization, and accountability.
Giandana Gigena notes that “in health-related contexts, sometimes consent may not always be sufficient to protect data in complex webs of data sharing.”
Terms of Service Change Without Warning
Because there’s no comprehensive federal privacy law in the US, ChatGPT is only bound by its own disclosures. The company can change its terms at any time.
This echoes concerns raised during the 23andMe bankruptcy, where users’ genetic data became part of the company’s assets.
Uploaded Data Is No Longer Private
Dr. David Bitterman recommends “the most conservative approach”: assume any information you upload will no longer be private.
Even with layered protections, the specifics of how ChatGPT Health protects data remain unclear.
Operating Rule: US-based platforms operate under weaker data protection rules than AVG. Jurisdiction determines your legal protections when breaches or disputes occur.
How Do Third-Party Apps Access Your ChatGPT Health Data?
One of the most concerning aspects of ChatGPT Health is how third-party apps access your data.
When you connect a wellness app or health tracker to ChatGPT Health, that app accesses your medical records. OpenAI says all apps must meet privacy and security requirements and collect only the minimum necessary data. Users can disconnect apps at any time.
Disconnecting Does Not Mean Deleting
Disconnecting an app does not delete the data it already collected. It only prevents future sharing.
Giandana Gigena explains: “Once data has been shared, it is almost impossible to completely and entirely delete it, and it is often indicated that this means that data control is lost.”
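A few lines of Python make the distinction concrete. The class and record names are hypothetical; the mechanics mirror the behavior described above:

```python
# Sketch of why disconnecting is not deleting. Class and record names are
# hypothetical; the mechanics mirror the behavior described above.
class ThirdPartyApp:
    def __init__(self) -> None:
        self.connected = True
        self.collected: list[str] = []  # copies the app already holds

    def sync(self, record: str) -> None:
        if self.connected:              # disconnecting only blocks FUTURE syncs
            self.collected.append(record)

app = ThirdPartyApp()
app.sync("2024 lab results")
app.connected = False                   # the user "disconnects" the app
app.sync("2025 lab results")            # blocked: no new sharing

print(app.collected)                    # -> ['2024 lab results']: old copy remains
```

Revoking the connection changes future behavior only; nothing in the disconnect step reaches back into the app’s existing store.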
Right to Erasure Under AVG
Under the AVG, you have the right to erasure, the “right to be forgotten.” This right applies to all processors handling your data.
In practice, enforcing that right across multiple US-based companies operating outside EU jurisdiction becomes difficult.
Where Breaches Actually Happen
The American Hospital Association’s 2025 cybersecurity review found:
- Over 90% of hacked health records were stolen outside the electronic health record system
- 100% of hacked data was either not encrypted or stolen via compromised credentials
The vulnerability is not always in the primary platform. It’s in the connected systems.
Control Loss: Third-party app connections create permanent data exposure. Disconnecting apps stops future sharing but does not delete data already collected.
What Questions Should You Ask About Digital Health Tools?
If you run a small business in the Netherlands, you already know the Autoriteit Persoonsgegevens takes data protection seriously. Fines under the AVG can reach €20 million or 4% of global annual turnover, whichever is higher.
Compliance is not about avoiding fines. It’s about maintaining client trust and operational control.
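For scale, the “whichever is higher” rule works out as follows (turnover figures are illustrative):

```python
# The AVG maximum-fine rule quoted above: €20 million or 4% of global
# annual turnover, whichever is higher. Turnover figures are illustrative.

def max_avg_fine(global_annual_turnover_eur: float) -> float:
    return max(20_000_000.0, 0.04 * global_annual_turnover_eur)

print(max_avg_fine(100_000_000))    # 4% = €4M, below the floor -> 20000000.0
print(max_avg_fine(1_000_000_000))  # 4% = €40M, above the floor -> 40000000.0
```

For a micro or small business, the €20 million floor is what applies; the 4% branch only takes over at very large turnovers.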
When you evaluate digital tools for your business, especially tools that handle health data, employee information, or client records, ask these questions:
Five Critical Questions for Digital Tool Evaluation
Where is the data stored?
EU-based servers provide stronger legal protections than US-based storage.
Who accesses the data?
Understand every entity in the data processing chain.
What happens when you delete data?
You should be able to verify that deletion occurred across all processors.
Does the provider comply with GDPR?
If a major tech company avoids the EU market, this tells you something about their compliance readiness.
What happens if the company changes ownership or goes bankrupt?
Your data becomes an asset in those situations.
Due Diligence Standard: Evaluate digital tools based on data storage location, access control, deletion proof, GDPR compliance, and ownership change scenarios.
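The five questions above can be turned into a rough vendor screen. The keys and pass/fail framing are my own simplification for illustration, not a formal audit standard:

```python
# The five questions above as a rough vendor screen. Keys and pass/fail
# framing are a simplification for illustration, not a formal audit.
QUESTIONS = {
    "data_stored_in_eu": "Where is the data stored?",
    "access_chain_documented": "Who accesses the data?",
    "deletion_verifiable": "What happens when you delete data?",
    "gdpr_compliant": "Does the provider comply with GDPR?",
    "ownership_change_covered": "What happens on ownership change or bankruptcy?",
}

def red_flags(vendor_answers: dict) -> list[str]:
    """Return the questions the vendor cannot answer affirmatively."""
    return [q for key, q in QUESTIONS.items() if not vendor_answers.get(key)]

vendor = {"gdpr_compliant": True, "data_stored_in_eu": False}
for flag in red_flags(vendor):
    print("red flag:", flag)
```

Any unanswered question is a red flag; a vendor that passes all five has at least documented the basics.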
What Remains Unclear About ChatGPT Health Privacy?
OpenAI has not provided clarity on several critical issues.
Law Enforcement Data Sharing
The company has not explained the processes it will use to determine when to share health data with law enforcement. Privacy experts ask: “Do they just turn over the information? Is the user in any way informed?”
This matters especially for:
- Reproductive health information
- Immigration-related health data
- Any health information used in legal proceedings
Advertising and User Profiling
OpenAI indicated it’s exploring advertising as a business model. While the company claims health data will not be used in other chats, privacy advocates want more clarity ensuring that health data or insights learned from it will not be used to profile users.
Transparency Gap: OpenAI has not clarified law enforcement data sharing processes or advertising use of health insights. These are structural gaps, not theoretical concerns.
Why the EU Delay Matters
I’m not opposed to AI in healthcare. I’m opposed to launching products that do not meet the data protection standards we’ve built in Europe.
The fact that ChatGPT Health is not available in the Netherlands yet gives us time. Time for OpenAI to build compliance structures that meet AVG requirements. Time for regulators to evaluate the product. Time for entrepreneurs and consumers to understand what they’re consenting to.
The Reality of Third-Party Data Control
Corynne McSherry, legal director for the Electronic Frontier Foundation, puts it clearly: “If you give your data to any third party, you are inevitably giving up some control over it and people should be extremely cautious about doing that when it’s their personal health information.”
What Will Happen
Data breaches will happen. Companies will comply with subpoenas and warrants. Terms of service will change.
These are not possibilities. These are certainties.
The question is whether you operate in a jurisdiction that gives you structural protections when those certainties occur.
In the Netherlands, you do. In the US, you do not.
Jurisdictional Protection: The EU delay provides time for compliance structures, regulatory evaluation, and informed consent. Dutch jurisdiction provides structural protections absent in US law.
What Should Expat Entrepreneurs Do?
If you’re an expat entrepreneur in the Netherlands, here’s your decision rule:
Do not upload sensitive health data to platforms that do not comply with GDPR.
If You Use AI for Health Questions
- Use it for general information only
- Do not connect your medical records
- Do not link third-party health apps
- Do not create aggregated health profiles on platforms outside EU jurisdiction
If You’re Building a Health Tech Product
Build GDPR compliance into your architecture from day one. Data protection is not a feature you add later. It’s a structural requirement that determines whether your product operates in the EU market.
If You’re Advising Clients on Digital Tools
Help them understand jurisdiction. The platform’s marketing language matters less than where the data lives and which laws govern it.
Final Rule: Structure is cheaper than recovery. When a major tech company avoids the EU market, that’s not a business decision. It’s a compliance signal. Pay attention to what companies avoid. It tells you what they cannot yet deliver.
Frequently Asked Questions
Why is ChatGPT Health not available in the Netherlands or EU?
ChatGPT Health does not meet the GDPR and AVG data protection standards required for processing health data in the European Economic Area, or the comparable rules in Switzerland and the UK. The product’s data handling practices do not comply with requirements for explicit consent, purpose limitation, data minimization, and accountability chains mandated under European law.
What happens to HIPAA protection when I upload medical records to ChatGPT Health?
Medical records uploaded to ChatGPT Health lose HIPAA protection because ChatGPT provides technology services outside HIPAA’s scope. No federal regulatory body governs health information provided to AI chatbots. You operate under OpenAI’s terms of service, which can change at any time.
What are the biggest risks of using ChatGPT Health?
The primary risks include data aggregation creating high-value breach targets, loss of control when third-party apps access your data, inability to fully delete shared information, absence of end-to-end encryption, changeable terms of service, and removal of AVG legal protections when data moves outside EU jurisdiction.
Does disconnecting third-party apps delete my health data?
No. Disconnecting an app stops future sharing but does not delete data already collected. Once data has been shared with a third party, complete deletion becomes nearly impossible. Under AVG, you have the right to erasure, but enforcing this across multiple US-based companies operating outside EU jurisdiction is difficult.
Does ChatGPT Health use end-to-end encryption?
OpenAI has not confirmed end-to-end encryption for ChatGPT Health. The company mentions encryption “at rest and in transit,” which is standard. Without end-to-end encryption, OpenAI can access your health data. End-to-end encryption means only you can decrypt your data, not the platform provider.
What should I do if I need AI help with health questions?
Use AI tools for general health information only. Do not connect medical records or third-party health apps. Do not upload sensitive health data to platforms outside EU jurisdiction. If you need personalized medical advice, consult healthcare professionals operating under proper data protection frameworks.
What questions should I ask before using digital health tools in my business?
Ask where data is stored, who can access it, what happens when you delete data, whether the provider complies with GDPR, and what happens if the company changes ownership or goes bankrupt. EU-based servers provide stronger legal protections than US-based storage.
What does AVG require for processing health data?
AVG requires explicit consent, clear purpose limitation, demonstrable data minimization, and proof of where data goes, who processes it, and how long it’s retained. When multiple parties process data, each becomes a data controller or processor, creating a chain of accountability the Autoriteit Persoonsgegevens can investigate.
Key Takeaways
- ChatGPT Health’s exclusion from the Netherlands and EU signals the product does not meet GDPR and AVG data protection standards for special category health data.
- Health records uploaded to US-based platforms lose HIPAA protection and operate under changeable terms of service without comprehensive federal privacy law protections.
- Data aggregation on centralized platforms creates high-value breach targets. Healthcare breaches in 2025 averaged 71,276 records per incident with average costs of €9.4 million.
- Third-party app connections create permanent data exposure. Disconnecting apps stops future sharing but does not delete data already collected.
- Jurisdiction determines your legal protections. Dutch and EU data subject rights become theoretical once health data crosses outside EU jurisdiction.
- Expat entrepreneurs in the Netherlands should not upload sensitive health data to platforms that do not comply with GDPR. Structure is cheaper than recovery.
- When major tech companies avoid the EU market, this signals compliance gaps, not business strategy. Pay attention to what companies avoid.