MIT Sloan blames third-party AI tools for over 55% of AI failures

A new study by MIT Sloan Management Review and Boston Consulting Group has found that third-party AI tools are responsible for over 55% of AI-related failures in organizations.

These failures can have serious consequences, including reputational damage, financial losses, loss of consumer trust, and even litigation.

The study surveyed 1,240 respondents across 87 countries, and found that 78% of companies use third-party AI tools. Of these organizations, 53% use third-party tools exclusively, without any in-house AI tech. However, despite the widespread use of third-party AI tools, only 20% of companies have evaluated the substantial risks they pose.

The researchers concluded that responsible AI (RAI) is harder to achieve when teams engage vendors without oversight, and a more thorough evaluation of third-party tools is necessary.

“Enterprises have not fully adapted their third-party risk management programs to the AI context or challenges of safely deploying complex systems like generative AI products,” Philip Dawson, head of AI policy at Armilla AI, told MIT researchers. “Many do not subject AI vendors or their products to the kinds of assessment undertaken for cybersecurity, leaving them blind to the risks of deploying third-party AI solutions.”

The researchers recommend that organizations implement thorough risk assessment strategies for third-party AI tools, such as vendor audits, internal reviews, and compliance with industry standards. They also argue that organizations should prioritize RAI at every level, from regulatory and compliance teams up to the CEO.

The research found that organizations whose CEO is directly involved in RAI are 58% more likely to report business benefits than those whose CEO is not. They are also almost twice as likely to invest in RAI.

The sources for this piece include an article in ZDNET.
