YouTube will remove any 'mature' or 'violent' content directed toward children amid mounting pressure to make its platform safer for minors.
According to The Verge, YouTube says it will weed out unsafe content by monitoring video titles, descriptions, and tags and will begin banning offenders following a grace period.
Targeted content will include any material that touches on sex, violence, death or other topics deemed inappropriate for young audiences.
While it may be odd to think that a platform of YouTube's size hadn't already been moderating 'violent' and 'mature' content directed towards kids, The Verge notes that up to this point YouTube has only age-restricted such content rather than removing it.
YouTube reportedly announced the change two days ago, quietly, through a YouTube Help community forum.
The platform said it will remove content if and when it's found, but won't give out any 'strikes' to creators until after a 30-day notice period meant to familiarize users with the policy.
Videos uploaded prior to the rule change, however, will not be given strikes, though they can still be removed.
As part of the push, YouTube will also age-restrict other types of content that it fears could be misconstrued as being for kids, such as adult cartoons.
An example, said the platform, would be a cartoon directed towards children that depicts inappropriate subject matter, such as a character 'injecting needles.'
On the heels of an undisclosed settlement with the Federal Trade Commission (FTC) over alleged breaches of the Children's Online Privacy Protection Act (COPPA), YouTube also recently agreed to...