The feature was first launched in August and is now rolling out to more users, according to The Verge. Google says it will remove images of minors unless it’s a “case of compelling public interest or newsworthiness.” Google’s support page indicates that the policy only applies to individuals who are currently under the age of 18. Requests can be submitted by minors, parents, guardians, or legal representatives from the company’s support page. Google will ask for the URLs of the images you want removed from its search results. Users also have to submit information such as the name and age of the minor, as well as the name of the individual representing them. However, the company points out that removal only applies to search results; it cannot delete the web pages themselves or the text and images hosted on them.
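There’s no public API for this; requests go through Google’s web form. Purely as an illustration of the information the form asks for, here’s a minimal sketch that bundles those fields into a structured record. Every name in it is hypothetical, not Google’s actual schema:

```python
from dataclasses import dataclass

# Hypothetical record mirroring the fields Google's web form asks for.
# All field names are illustrative; Google does not expose an API for this.
@dataclass
class ImageRemovalRequest:
    image_urls: list[str]          # search-result image URLs to be removed
    minor_name: str                # name of the minor pictured
    minor_age: int                 # age of the minor
    requester_name: str            # parent, guardian, or legal representative
    requester_relationship: str    # e.g. "parent", "guardian", "self"

    def is_eligible(self) -> bool:
        # The policy only covers individuals who are currently under 18.
        return self.minor_age < 18

# Example: a parent filing on behalf of a 12-year-old.
request = ImageRemovalRequest(
    image_urls=["https://example.com/photo.jpg"],
    minor_name="Jane Doe",
    minor_age=12,
    requester_name="John Doe",
    requester_relationship="parent",
)
assert request.is_eligible()
```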

If a user’s removal request is denied, they will have a chance to re-submit their request later

Moreover, the tool can’t remove images from the websites that host them. Google recommends getting in touch with the site’s webmaster to initiate content removal in such cases, and its support page includes a guide with steps for contacting a webmaster. So what happens when a removal request is denied? Google says it will “include a brief explanation” when informing the user, who can then re-submit the request.

In related news, YouTube recently launched a new metric to track views on videos that violate the site’s policies. Known as the Violative View Rate (VVR), it estimates what percentage of all views on YouTube land on content that breaks the platform’s rules, which helps the company gauge its progress in removing such content. Google will share VVR stats every quarter in the YouTube Community Guidelines Enforcement Report. We also learned that YouTube removed a total of 2,055,515 channels in 2020 over policy violations, and those channels had a total of 51,078,806 video uploads between them.
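Since VVR is defined as the share of total views that come from violative content, the arithmetic behind it is simple. Here’s a minimal sketch of that definition; the sample numbers are made up for illustration, not YouTube’s actual figures:

```python
def violative_view_rate(violative_views: int, total_views: int) -> float:
    """Return the Violative View Rate as a percentage of total views."""
    if total_views <= 0:
        raise ValueError("total_views must be positive")
    return 100.0 * violative_views / total_views

# Illustrative numbers only: if 18 out of every 10,000 sampled views
# landed on violative content, the VVR would be 0.18%.
print(f"VVR: {violative_view_rate(18, 10_000):.2f}%")  # -> VVR: 0.18%
```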