Google announced that parents will now be able to request the removal of their children’s images from Google search results, saying the change would give young people “more control over their digital footprint.”
Other changes include stopping ad targeting based on children’s age, gender or interests, and preventing “age sensitive” categories of ads from being shown to younger users.
The company will also change the default setting for videos uploaded by children to “the most private option,” turn on the adult-content filter SafeSearch for minors, prevent teenagers from using Location History (a feature that continuously tracks and logs a phone’s location), and provide new parental advice on the Google Play Store.
Commenting on the latest development, the tech giant said: “Of course, removing an image from search doesn’t remove it from the web but we believe this change will help give young people more control of their images online. Some countries are implementing regulations in this area – and as we comply with these regulations, we’re looking at ways to develop consistent product experiences and user controls for kids and teens.”
Many large technology companies have adopted such measures under scrutiny from governments and security advocates.
Instagram, for example, has made the accounts of under-16s private by default as it battles criticism of its plans to introduce a children’s version of the app.
For more information, read the original story on the BBC.