New tool protects open source AI from malware and code compromise


In the digital age, a new kind of Trojan horse has emerged in the form of AI models laced with malicious code. The AI community got a jolt from Protect AI’s revelation that a staggering 3,354 models on Hugging Face, a go-to AI model depot, contained potential malware or compromised code.

Worse, it also appeared that Hugging Face’s security scans missed the threats in a third of these compromised models.

This has led Protect AI to develop a scanner tailored to detecting malware and compromised code in open source AI models.

Open source AI models are gaining in popularity given the costs associated with building and training a proprietary model.

This has made platforms like Hugging Face incredibly popular but, if Protect AI's numbers are correct, it has also made them a potential source of compromised AI code.

Protect AI’s scanning software is one potential tool to detect these issues and ensure the safety of open source AI models.
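For context on how code can end up inside a model in the first place: many open source models are distributed as Python pickle files, and loading a pickle can execute code the file specifies. The sketch below is not Protect AI's actual scanner; it is a minimal, assumed illustration that walks a pickle file's opcodes with Python's standard pickletools module and flags imports of modules commonly abused for code execution (the SUSPICIOUS_MODULES list is an illustrative assumption, not a complete rule set).

```python
# Illustrative sketch only, not Protect AI's scanner: flag pickle files
# that import modules commonly abused for code execution on load.
import pickletools
import sys

# Assumed, non-exhaustive list of modules worth flagging.
SUSPICIOUS_MODULES = {"os", "subprocess", "builtins", "posix", "nt"}

def scan_pickle(path):
    findings = []
    with open(path, "rb") as f:
        data = f.read()
    # genops yields (opcode, arg, position) for every opcode in the stream.
    for opcode, arg, pos in pickletools.genops(data):
        # The GLOBAL opcode imports "module name" during unpickling; newer
        # pickles use STACK_GLOBAL, which a real scanner would also resolve.
        if opcode.name == "GLOBAL" and arg:
            module = str(arg).split(" ")[0].split(".")[0]
            if module in SUSPICIOUS_MODULES:
                findings.append("offset %d: imports %s" % (pos, arg))
    return findings

if __name__ == "__main__":
    for hit in scan_pickle(sys.argv[1]):
        print("WARNING:", hit)
```

A production scanner covers far more than this: other serialization formats, obfuscated payloads, and behavioral indicators, which is part of why a dedicated tool is needed.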

How will Protect AI keep up to date on threats? The company has acquired Huntr, a bug bounty program aimed at AI models, which it hopes will provide continuing insight into new threats as they evolve.

Sources include: Axios
