Pittsburgh Doctor Warns Against Using ChatGPT for Medical Advice
Source: cbsnews.com
Published on October 9, 2025
Updated on October 9, 2025

The rise of artificial intelligence (AI) tools like ChatGPT has transformed many sectors, but one Pittsburgh doctor is cautioning against relying on them for medical advice. Dr. James Solava, Medical Director of Information Technology at AHN, emphasizes that while AI has its merits, it cannot replace the expertise of trained physicians.
"AI tools like ChatGPT can provide information, but they lack the ability to perform critical tasks like physical examinations or ask the right follow-up questions," Dr. Solava explained. "This makes them unsuitable for diagnosing serious conditions where time is of the essence."
The Limitations of AI in Healthcare
One of the major concerns with relying on AI for medical advice is the phenomenon of "hallucinations." These are instances where the AI generates incorrect or misleading information. For conditions like chest pain or stroke symptoms, such inaccuracies could have life-threatening consequences.
"ChatGPT aims to please, which means it might provide information that sounds correct but is actually harmful," Dr. Solava warned. "It doesn't have the ability to assess a patient's condition through physical touch or observation, which are crucial in accurate diagnosis."
The Importance of Human Doctors
Human doctors bring years of medical training and experience to the table. They can listen to a patient's heart and lungs, feel their abdomen, and ask follow-up questions in ways AI cannot replicate. These skills are essential to providing personalized and accurate medical care.
"AI can be a useful tool for minor health issues, but it should never replace professional medical advice," Dr. Solava stated. "In critical situations, seeking help from a qualified physician is essential."
Responsible Use of AI in Medicine
While AI has its limitations, it can still play a role in healthcare. For instance, it can provide general information about minor health concerns or help patients understand basic symptoms. It should never stand in for professional medical advice, however.
"Patients should use AI responsibly and always consult a doctor for serious health issues," Dr. Solava advised. "The expertise and personal touch of a human doctor cannot be replaced by technology."
Conclusion
The warning from Dr. Solava serves as a reminder of the importance of human doctors in healthcare. While AI tools like ChatGPT have their uses, they cannot replace the critical role of trained physicians in diagnosing and treating medical conditions. Patients are advised to seek professional medical help for serious health concerns and use AI tools with caution.