This week, Spark got a reality check as to what can happen when you advertise on YouTube, where algorithms feed viewers videos to keep them hooked on the site.
That includes the YouTube algorithm recommending videos of children to predators, who would then leave comments on the clips. Absolutely horrific stuff, and Spark is one of several large companies to yank their ads from YouTube.
It's not the first time the "algo" has done exactly what it's designed to do, and it most likely won't be the last either.
Money talks and Google listens when advertisers stop spending.
The internet giant is promising to fight disinformation and to tweak the recommendations (again) to minimise the lies, distortions and outright dangerous material that's being served up.
That, however, is an after-the-fact reaction which does nothing to explain why the rubbish videos are on YouTube in the first place — or why comments on them are allowed when moderation of what's being said is so poor.
It's not just the videos that can be fake on YouTube. Security vendor RiskIQ recently detailed how easy it is for scammers to impersonate YouTube celebrities with millions of subscribers, tricking fans into clicking on fraudulent sites.
The impersonation scam's been going on for three years, RiskIQ reckons.
Will Google's tweaks and changes make things better, so that YouTube becomes a safer place for everyone, advertisers included? Probably not, unless the goal of the algorithm is changed fundamentally.
Former YouTube engineer Guillaume Chaslot of Algotransparency.org worked on the artificial intelligence system powering the recommendations, and he's not hopeful, as the algorithm creates a feedback loop.
For instance, when depressed people hang out on YouTube a lot, the site will often recommend terrible material to them. Such content gets more views, which creates an incentive for others to make more of it, which YouTube duly feeds back to depressed viewers.
Likewise, deleting millions of comments and hundreds of channels is unlikely to make the AI change its mind and stop recommending videos that appeal to child abusers, as it is designed to maximise time spent watching clips, Chaslot noted.
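The feedback loop Chaslot describes can be sketched in a toy simulation. This is not YouTube's actual system — the video names and numbers are made up for illustration — but it shows how a recommender that simply promotes whatever already holds attention widens the gap on every pass:

```python
import random

# Toy model: each video has accumulated watch time; the recommender
# always surfaces the current watch-time leader (a stand-in for
# "maximise time spent watching clips").
videos = {"informative": 100.0, "sensational": 110.0}

random.seed(42)
for _ in range(1000):
    # Recommend the video with the most accumulated watch time.
    pick = max(videos, key=videos.get)
    # Assume sensational material holds attention slightly longer on
    # average, so each recommendation compounds its lead.
    watch = random.uniform(0, 2) * (1.2 if pick == "sensational" else 1.0)
    videos[pick] += watch

print(videos)
```

Because the early leader is the only thing ever recommended, it absorbs all the watch time and the gap never closes — which is why deleting individual channels doesn't change what the loop rewards.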
It is a shame, really. YouTube — and other internet video sites — can be great archives of material that would otherwise be forgotten or never discovered at all.
There is plenty of useful and informative content on YouTube as well amid the dross.
Even the comments are, at times, funny and informative. The irony here is that without algorithms surfacing content that you might be interested in, it would be very difficult to find what you want. People aren't designed to browse databases.
Google should think about that and change YouTube's AI away from maximising views at all costs.
Whether that happens remains to be seen, as it would cost Google ad revenue. Chances are change will instead come through regulation, after the next few scandals that are waiting in the wings.