Medical Chatbot Hacked Into Giving Dangerous Advice


Security researchers have demonstrated that a healthcare AI chatbot used in a US medical pilot can be manipulated into producing dangerous advice and misleading clinical notes, raising new questions about how safely AI can operate inside real healthcare systems.

What Happened?

Doctronic is a US telehealth platform built around an AI medical chatbot designed to help patients understand symptoms, manage conditions, and connect with licensed doctors. The system is intended to act as a first point of contact in a digital care pathway: gathering patient information, offering guidance, and preparing summaries for clinicians.

Published on MSP Marketplace.