Sony’s AI agent Gran Turismo Sophy (GT Sophy) has defeated top human players of the PlayStation racing game Gran Turismo. Sony, like other corporations, has long used games as a testbed for improving AI.
Successfully deploying GT Sophy required bringing together several components: fundamental AI research, a hyper-realistic racing simulator, and the infrastructure for large-scale AI training.
The AI first raced against the top four Gran Turismo drivers in July but could not beat them; after learning from that race, it outdrove the human competitors in a rematch in October.
Compared with past cases in which AI defeated humans at chess, Mahjong, and Go, GT Sophy appears to have overcome more difficult technical challenges, including controlling a racing car in real time.
“It took about 20 PlayStations running simultaneously for about 10 to 12 days to train GT Sophy to race from scratch to superhuman level,” said Peter Wurman, director of Sony AI America and head of the team that designed the AI.
For more information, read the original story at Reuters.