Today's AI/ML headlines are brought to you by ThreatPerspective

Digital Event Horizon

A Case of Deception: The Story Behind a Malicious AI Software Hack




A California man has pleaded guilty to hacking a Disney employee by tricking them into installing a malicious extension for an AI image generation tool. The attacker used the resulting access to steal sensitive information, including private company material and the employee's personal data. The incident underscores the need for vigilance when installing third-party AI software and for companies to implement robust security measures to protect sensitive data.

  • Ryan Mitchell Kramer hacked a Disney employee by publishing an app on GitHub that tricked users into installing a malicious extension.
  • Kramer accessed private Disney Slack channels and downloaded over 1.1 terabytes of confidential data, including personal and company information.
  • After the breach, he contacted the employee pretending to be a member of a hacktivist group, then publicly released the stolen data.
  • The incident highlights the need for robust security measures to protect sensitive data and for vigilance when installing AI software.



    The use of artificial intelligence (AI) has become increasingly prevalent in daily life, from image generation to password management. A darker side has emerged as well, as seen in the case of Ryan Mitchell Kramer, a 25-year-old California man who pleaded guilty to hacking a Disney employee by way of a malicious AI tool.

    In April 2024, Kramer published an app on GitHub for creating AI-generated art: ComfyUI_LLMVISION, a malicious extension for the widely used open source AI image generation tool ComfyUI. The extension contained code that gave Kramer access to computers on which it was installed, and through one such machine he was ultimately able to pull sensitive information from thousands of private Disney Slack channels.

    After gaining unauthorized access to the victim's computer and online accounts, Kramer reached private Disney Slack channels. In May 2024, he downloaded roughly 1.1 terabytes of confidential data from those channels, including private company material as well as the employee's banking, medical, and other personal information.

    Kramer then contacted the employee, pretending to be a member of a hacktivist group. After receiving no reply, he publicly released the stolen information, which included not only private Disney material but also the employee's sensitive personal data.

    The case is a prime example of how malicious actors can piggyback on the popularity of AI software to reach sensitive information. No software vulnerability was needed: a trojanized extension distributed through a legitimate platform, combined with social engineering, was enough to get the malicious code installed and ultimately to steal sensitive data.

    Law enforcement took notice, and the FBI investigated Kramer's activities. Kramer has pleaded guilty to one count of accessing a computer and obtaining information and one count of threatening to damage a protected computer.

    The incident is a reminder to be vigilant when installing AI software and for companies to maintain robust security measures: even a seemingly legitimate app can serve malicious purposes if it is not thoroughly vetted before installation.
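    As a purely defensive illustration of what "vetting" can mean in practice, the sketch below scans a downloaded extension's source files for a few red flags commonly seen in trojanized packages before anything is run. The pattern list is hypothetical and illustrative, not drawn from this incident; a clean scan is no guarantee of safety, and a hit is a reason to read the code closely, not proof of malice.

```python
import re
from pathlib import Path

# Hypothetical red-flag patterns often seen in trojanized packages:
# dynamic code execution, decoded payloads, and hard-coded exfiltration URLs.
SUSPICIOUS_PATTERNS = {
    "dynamic exec": re.compile(r"\b(eval|exec)\s*\("),
    "encoded payload": re.compile(r"base64\.b64decode"),
    "webhook exfiltration": re.compile(r"https?://[^\s\"']*(?:discord|webhook)[^\s\"']*"),
}

def audit_extension(directory: str) -> dict:
    """Scan every .py file under `directory` and report which files
    match which suspicious patterns, without executing any of them."""
    findings = {}
    for path in Path(directory).rglob("*.py"):
        text = path.read_text(errors="ignore")
        hits = [name for name, pat in SUSPICIOUS_PATTERNS.items() if pat.search(text)]
        if hits:
            findings[str(path)] = hits
    return findings
```

    Running such a check on an extension folder before copying it into a tool's plugin directory costs seconds; the harder, more reliable step remains actually reading unfamiliar code and preferring extensions with an established, auditable history.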

    In conclusion, the case of Ryan Mitchell Kramer serves as a cautionary tale about the potential risks associated with the use of AI software. As AI technology continues to evolve and become more prevalent in our daily lives, it is essential that we remain vigilant and take steps to protect ourselves against cyber threats.



    Related Information:
  • https://www.digitaleventhorizon.com/articles/A-Case-of-Deception-The-Story-Behind-a-Malicious-AI-Software-Hack-deh.shtml

  • https://arstechnica.com/ai/2025/05/man-pleads-guilty-to-using-malicious-ai-software-to-hack-disney-employee/


  • Published: Mon May 5 20:25:12 2025 by llama3.2 3B Q4_K_M











    © Digital Event Horizon . All rights reserved.

    Privacy | Terms of Use | Contact Us