Towards building AI Assistants for Doctors: Vision, Language, and Interaction

June 18 @ 11:00 am - 12:00 pm

Medical imaging, including magnetic resonance imaging (MRI) and computed tomography (CT), is central to modern diagnosis and treatment planning. Although contemporary radiological AI systems can achieve fast and accurate diagnoses, most offer limited user interaction within clinical workflows. This gap hinders adoption by reducing transparency, adaptability, and trust. This talk will showcase research from our Health-X Lab on building radiological AI systems designed to collaborate with clinicians through intuitive and natural interaction channels. Specifically, I will highlight three lines of work: leveraging human visual attention during radiological reading to better align AI models with expert behavior; enabling flexible, language-based interaction with medical images; and developing interactive AI agents that support real-time, user-driven analysis. Together, these approaches illustrate how integrating visual perception, language, and interaction can transform AI from passive tools into effective clinical co-pilots, enhancing usability, interpretability, and seamless workflow integration.

Room: 321, Bldg: Duff Medical Building, 3775 Rue University, Montreal, Quebec, Canada, Virtual: https://events.vtools.ieee.org/m/558822