ChatGPT can now alert trusted contacts over self-harm concerns
08 May 2026
OpenAI has introduced a new safety feature called "Trusted Contact" in ChatGPT. The tool lets adult users designate an emergency contact for mental health and safety concerns.
If OpenAI's systems detect that a user may have talked about self-harm or suicide with the chatbot, ChatGPT first encourages the user to reach out, and then a small team of specially trained reviewers assesses the situation.
A Trusted Contact will be notified only if that review determines there are serious safety concerns.
How to set up trusted contacts
Feature explanation
The Trusted Contact feature is an opt-in service that any adult ChatGPT user can enable.
To do this, they have to provide the contact details of another adult (18+ globally or 19+ in South Korea) in their ChatGPT account settings.
The chosen Trusted Contact must accept the invitation within a week of receiving it.
Users can change or remove their selected contact at any time, and the Trusted Contact can also opt out at any time.
Trusted Contact won't get chat details
Privacy measures
OpenAI has assured that the notifications sent to Trusted Contacts are "intentionally limited" and do not include chat details or transcripts.
If a user talks about self-harm, ChatGPT will suggest they reach out to their Trusted Contact for help. It will also inform them that their contact may be notified.
A small team of specially trained people at OpenAI will review such cases and send an email, text message, or in-app notification if serious safety concerns are found.
Responding to lawsuits over mental health conversations
Industry response
The introduction of the Trusted Contact feature comes as AI companies face increasing scrutiny over chatbot safety, emotional dependency, and their responses during crises.
The move also comes after several lawsuits accused ChatGPT of pushing emotionally vulnerable users toward self-harm or suicide.
OpenAI has stressed that every serious alert is reviewed by a human, and that it aims to complete these reviews quickly, usually within an hour.
Developed in consultation with mental health experts
Development process
The development of the Trusted Contact feature involved collaboration with clinicians, researchers, and organizations focused on mental health and suicide prevention.
OpenAI's Global Physicians Network of over 260 licensed physicians across 60 countries and its Expert Council on Well-Being and AI also contributed to this initiative.
The company worked closely with external organizations such as the American Psychological Association during the development process.
Please seek help if you're having suicidal thoughts
Mental health
If you or anyone you know is suffering from suicidal thoughts, you can reach out to AASRA for suicide prevention counseling. Its number is 022-27546669 (24 hours).
You can also dial Roshni NGO at +914066202000 or COOJ at +91-83222-52525.
Sneha India Foundation, which works 24x7, can be contacted at +91-44246-40050, while Vandrevala Foundation's helpline number is +91-99996-66555 (call and WhatsApp).