Lawrence Abrams* reports that online video platform YouTube will be taking more care with the content it recommends.
YouTube has announced that it will reduce recommendations of videos that promote misinformation and conspiracy theories, such as claims that the earth is flat or that 9/11 never happened.
While these videos do not necessarily break YouTube’s policies, they come close. For this reason, YouTube will begin reducing its recommendations of videos in this category, using a combination of machine learning and human reviewers.
In a blog post announcing the change, YouTube explained: “To that end, we’ll begin reducing recommendations of borderline content and content that could misinform users in harmful ways – such as videos promoting a phony miracle cure for a serious illness, claiming the earth is flat, or making blatantly false claims about historic events like 9/11.”
This does not mean that such content will be removed from YouTube or banned from being uploaded in the future, but it does mean YouTube will no longer recommend these videos as ones you may find interesting.
YouTube feels that this strikes a balance between providing a platform for free speech and not contributing to the spread of misinformation.
“To be clear, this will only affect recommendations of what videos to watch, not whether a video is available on YouTube,” the blog post continues.
“As always, people can still access all videos that comply with our Community Guidelines and, when relevant, these videos may appear in recommendations for channel subscribers and in search results. We think this change strikes a balance between maintaining a platform for free speech and living up to our responsibility to users.”
Recommendation engine still needs work for kids
Recent research shows that the recommendation engine still needs a lot of work when it comes to kids: researchers found that YouTube’s recommendation engine has roughly a 45 per cent chance of eventually recommending an inappropriate video to a child.
These same researchers developed a deep learning classifier that could accurately detect videos inappropriate for children roughly 82 per cent of the time, and used it to measure how quickly the recommendation engine can lead a child to such content.
“Following their extensive tests, the deep learning video content classifier was able to reach an accuracy of 82.8 per cent and it also helped them reach the conclusion that toddlers watching videos on the YouTube platform have a 45 per cent chance of being suggested an inappropriate one within 10 hops if starting ‘from a video that appears among the top 10 results of a toddler-appropriate keyword search (e.g., Peppa Pig).’”
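The hop-based measurement lends itself to a short simulation. The Python sketch below is a minimal illustration of that methodology, not the researchers’ actual code: `recommend` and `is_inappropriate` are hypothetical stand-ins for YouTube’s “up next” list and the study’s deep learning classifier, and the toy graph replaces real API data.

```python
import random


def walk_probability(seed_videos, recommend, is_inappropriate,
                     max_hops=10, walks_per_seed=1000):
    """Estimate the chance that a random walk over recommendations
    reaches an inappropriate video within max_hops of a seed video.

    recommend(video) -> list of recommended videos (hypothetical
    stand-in for the platform's suggestions); is_inappropriate(video)
    -> bool (stand-in for the study's deep learning classifier).
    """
    hits, total = 0, 0
    for seed in seed_videos:
        for _ in range(walks_per_seed):
            total += 1
            current = seed
            for _ in range(max_hops):
                candidates = recommend(current) or []
                if not candidates:
                    break  # dead end: no further recommendations
                # Model a child tapping one suggested video at random.
                current = random.choice(candidates)
                if is_inappropriate(current):
                    hits += 1
                    break
    return hits / total if total else 0.0


if __name__ == "__main__":
    # Toy recommendation graph standing in for real recommendation data.
    graph = {
        "peppa-1": ["peppa-2", "peppa-3"],
        "peppa-2": ["peppa-1", "bad-1"],
        "peppa-3": ["peppa-1"],
    }
    p = walk_probability(["peppa-1"], lambda v: graph.get(v, []),
                         lambda v: v.startswith("bad"))
    print(f"about {p:.0%} of 10-hop walks hit an inappropriate video")
```

With real data, `seed_videos` would be the top 10 results of a toddler-appropriate keyword search, mirroring the study’s setup.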
* Lawrence Abrams is the creator, owner and Editor in Chief of BleepingComputer.com.
This article first appeared on www.bleepingcomputer.com.