They’re created by a channel called “BillionSurpriseToys”. They have an official Twitter.
3 things: 1) They have their location set as the UAE, so we know what country these are coming from
2) They joined in October 2015, which means they’ve been around for a while, even before the “elsagate” thing was big
3) They have a website
When I loaded up the website, Google immediately let me know that the connection was not secure
and….. my adblock blocked 84 popups/ads
what the hell. so there’s obviously something fishy with this page
they have a shop….. okay
where you can buy… “merchandize”…. which turns out to be six listings of the same t-shirt
They also have an about page… filled with… more grammatical/spelling errors
“technology dominates ou kids’ lives”
“inculate”??????
They also have a blog…. with generic parenting-related topics
most of these articles are posted multiple times with different titles
These posts seem suspiciously more competent than the usual phrasing on this website, and are written from the perspective of parents. careful googling reveals that they’re actually from a mommy blog, here, whose content was just wholesale stolen.
…..okay
on another note, they have a character page with… this guy
…… i hate it
anyways, at the bottom of the page, there’s a link to the hosting company they use.
This redirects to what looks like a normal webhosting/media management company... but after reading it…. it has the same text as the Billion Surprise Toys company. it’s some kind of shell.
I looked up the company…
They’re a 3d animation company in India pretending to be a media/web hosting company for a youtube channel based in the UAE??? what is happening??? also btw they’re hiring!
This website also has a privacy policy page
A note.
The website also has the same layout as the first website… but with a buuunch of broken links.
….except for these, which all link to the main website instead of their profile. okay
In conclusion: What the fuck ?
So, I’ve been dealing with these videos for years now at work, and the simple answer is that they’re ad revenue generators.
Make enough of them that are similar enough and pass kid-safe standards (which they all do because they’re brightly colored and nonviolent, two of the major standards), and you can get the YouTube autoplay algorithm to put your videos on basically infinitely, until someone actually changes the channel. The next recommended video, based on content and audience similarity, will just be another one of your “lullaby educational kids song 3 hours” monstrosities, indefinitely.
A kid, babysitter, or exhausted parent clicks on one of them. It plays. Ads appear every 7 minutes (the most ads you can run before additional spam screening kicks in). The kid wanders off without turning off the console or computer, just the screen. Now you get ad revenue every 7 minutes until that machine turns off or gets used again, probably overnight or maybe even a full day, because every next autoplay is another of your videos. And every video that plays further locks in the autoplay.
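For a sense of scale, here’s a rough back-of-envelope sketch of that idle-screen scenario. Only the 7-minute ad interval comes from the description above; the CPM, fill rate, hours idle, and device count are made-up placeholder numbers, not actual figures.

```python
# Rough back-of-envelope sketch of the idle-autoplay revenue model described above.
# The 7-minute ad interval is from the post; everything else (CPM, fill rate,
# hours idle, device count) is a hypothetical placeholder, not a real figure.

AD_INTERVAL_MIN = 7          # one ad slot every 7 minutes (from the post)
HOURS_IDLE = 10              # hypothetical: screen left playing overnight
CPM_USD = 1.50               # hypothetical revenue per 1,000 ad impressions
FILL_RATE = 0.5              # hypothetical fraction of slots that actually serve an ad
DEVICES = 100_000            # hypothetical number of idle devices on a given night

slots_per_device = (HOURS_IDLE * 60) // AD_INTERVAL_MIN
impressions = slots_per_device * FILL_RATE * DEVICES
revenue = impressions / 1000 * CPM_USD

print(f"{slots_per_device} ad slots per idle device")
print(f"~${revenue:,.0f} per night across {DEVICES:,} devices")
```

Even with these invented numbers, the point is that the per-device take is tiny; the model only works because the autoplay loop multiplies it across a huge number of unattended screens.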
Channels set up these nested shell companies so their channel can become verified as an official business channel, which automatically reduces the amount of screening you undergo, too.
They tend to be based in India because there’s a sufficiently skilled talent base and resource base to produce the hundreds of hours of just-distinct-enough visuals. In many cases, titles are generated algorithmically based on what pulled the most revenue the week before and on search terms from the last few days. Videos are then created to match the algorithmic titles, hence the bizarre combinations of topics.
This helps get these videos to the top of search results for popular kids searches, increasing the chance that the infinite recommendation loop gets started by the largest possible number of people.
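If you want a concrete picture of the “algorithmically generated titles” part, here’s a minimal sketch of the kind of recombination being described. This is not anyone’s actual code, and all the phrases and search terms are invented; it just shows how mashing last week’s top earners together with recent search terms produces those bizarre topic combinations.

```python
# Minimal sketch of the title-generation idea described above: recombine the
# phrases that earned the most last week with search terms trending over the
# last few days. All data here is invented; this is illustrative only.
import itertools
import random

top_earning_phrases = ["Learn Colors", "Johny Johny", "Finger Family", "Ice Cream Song"]
recent_search_terms = ["nursery rhymes", "for kids", "superheroes", "3 hours"]

def generate_titles(phrases, terms, n=5):
    """Pair every earning phrase with every trending term, then sample a few."""
    combos = [f"{phrase} {term} | Kids Songs" for phrase, term in itertools.product(phrases, terms)]
    return random.sample(combos, n)

for title in generate_titles(top_earning_phrases, recent_search_terms):
    print(title)
```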
I admit, if you haven’t been wading through this shitshow professionally for a few years, it probably looks pretty creepy.
But it’s not a cult or a conspiracy.
It’s just capitalism.