A shred of sanity from YouTube

YouTube has long been criticized for its secret algorithms and its tendency to lead viewers down some pretty disturbing rabbit holes, like when children are served videos featuring disturbing or violent content.
As one writer recently remarked:

“Someone or something or some combination of people and things is using YouTube to systematically frighten, traumatize, and abuse children, automatically and at scale.”

And it’s not just children who are being manipulated: plenty of adults have fallen prey to misleading conspiracy theories or inadvertently watched something vile because YouTube’s algorithm saw fit to play it next.
Now, the ubiquitous platform is finally taking some steps to rein in the problem. The New York Times reports:

In a blog post, YouTube said it would no longer suggest videos with “borderline content” or those that “misinform users in a harmful way” even if the footage did not violate its community guidelines.

We’re always alert to anything that reeks of conservative censorship, and none of the major tech platforms have a very good track record on free speech issues near and dear to conservatives’ hearts.

Facebook and Twitter have been the most egregious, taking down conservative commentators’ accounts with no warning and flimsy reasoning. But YouTube (owned by Google’s parent company) hasn’t earned any gold stars either, as far as conservatives are concerned.

But for concerned citizens who want our children protected from filth and exploitation, and who prefer to win over supporters with the sound logic of small government and limited regulation rather than by tricking them with baseless conspiracies, the news from YouTube is a welcome development.
The policy seems balanced: content in this new category will still be recommended to users who subscribe to a channel that produces it (i.e., conspiracy theorists can sign up to receive more conspiracy theories), and the videos will still show up in search results.

But you won’t get sucked down a vortex of one video leading to the next and end up on something questionable via YouTube’s “recommendations” (i.e., the heavily guarded algorithm that determines what content to recommend next to a user).

A shred of sanity in an otherwise depressing tech landscape.