ChatGPT barred from giving health, legal advice? OpenAI says no

New policy specifies that ChatGPT cannot provide tailored advice requiring a professional licence, such as legal or medical guidance, unless a licensed professional is involved
OpenAI and ChatGPT logos are seen in this illustration taken February 3, 2023. — Reuters

OpenAI has addressed false claims circulating online that ChatGPT has been barred from providing legal and medical advice, a clarification intended to put the speculation to rest.

The rumours surfaced after a ChatGPT policy update, rolled out on October 29, which included guidelines on the chatbot's limitations.

Karan Singhal, OpenAI’s head of health AI, clarified on X (formerly Twitter) that the assertions are “not true.” He stressed that while ChatGPT is not a replacement for professional advice, it remains a valuable resource for understanding legal and health-related information.

The new policy specifies that ChatGPT cannot provide tailored advice requiring a professional licence, such as legal or medical guidance, unless a licensed professional is involved.

The policy change in question is not new: it reflects existing guidelines that have always discouraged users from seeking tailored legal, medical, or financial advice without professional oversight.

Previously, OpenAI maintained three separate policies for its different services; the latest update consolidated these into a unified set of rules.

Despite the restructuring, the core principles remain unchanged, reminding users of the limitations of AI assistance in sensitive areas such as health and law.

OpenAI's clarification aims to reassure users that while ChatGPT can help them understand information, it should not replace professional advice.