Microsoft Catches APTs Using ChatGPT For Vuln Research, Malware Creation

Updated on February 15, 2024

Microsoft threat hunters say foreign APTs are using OpenAI’s ChatGPT to automate malicious vulnerability research, target reconnaissance, and malware creation tasks.

In a report published on Wednesday, Microsoft said it joined forces with OpenAI to study the use of LLMs by malicious actors and found several known APTs experimenting with ChatGPT to research potential victims, improve malware scripting, and comb through public security advisories. Microsoft also said it caught hacking teams from Russia, North Korea, China, and Iran using LLMs in active APT operations.

In one such case, Microsoft said it caught the North Korean APT Emerald Sleet (aka Kimsuky) using LLMs to generate content most likely destined for spear-phishing campaigns. The Pyongyang hackers were also caught using LLMs to research publicly known vulnerabilities, troubleshoot technical issues, and get help with various web technologies. Microsoft stated:

“Interactions have involved requests for support around social engineering, assistance in troubleshooting errors, .NET development, and ways in which an attacker might evade detection when on a compromised machine.”

Redmond also found evidence that APT groups were using generative AI to better understand reported vulnerabilities such as CVE-2022-30190, the Microsoft Support Diagnostic Tool (MSDT) flaw known as “Follina.”

In these cases, Microsoft worked with OpenAI to disable all accounts and assets linked to the advanced threat actors.


About The Author

Urfa Sarmad

Urfa is a business management graduate who delved into the world of tech, data privacy, and cybersecurity, and has been writing tech- and privacy-related content ever since.
