This doesn't solve the number one problem of kids content on YouTube -- they are leaving it up to the creator to decide if it's kids content. It's still not human curated.
The nice thing about Netflix/Hulu/Amazon/PBS/etc. is that a human decides if the content is for kids.
YouTube is still trying to get away with "letting the machines do it", which works for most aspects of their business, but not this.
I used to let my kid watch YTKids while I supervised. But no more. Even with my supervision it just jumps to some insanely inappropriate video, too quick for me to shut it off in time, and it's already done damage.
When a scary clown jumps out and screams, that's traumatizing, even just for a moment.
I have no trouble letting them watch all those other platforms. Heck, I let them watch PBS unsupervised.
But YouTube Kids will never play in my house again until I know that every piece of content on there has been reviewed by a human before it goes live.
>YouTube is still trying to get away with "letting the machines do it", which works for most aspects of their business, but not this.
Citation needed. I mean, they do use AI for lots of stuff, but let's not describe it as working. It has routinely caused massive aggravation for their biggest content creators, and it has been one in a group of changes that has basically made YouTube an unsafe monetization platform and driven the creation of places like Patreon.
Sure, this is the only AI fuck up that is going to get them sued in federal court and face charges of exploiting children, but it's not like the other AI stuff is working well. Basically, their AI is doing a bad job everywhere, it's just that this area is where they face legal liability and so they can't pull out their usual "but the machines told us to!" defence.
Heh, I was giving them the benefit of the doubt. They've been running on "algorithms" since 2005, and are still around, so I assume it works for them for some definition of working.
But you have a point. YouTube and Google get a lot of flak for things that are emergent properties of their lack of humans.
YouTube doesn't release many of the details of the methodology, or even the content that they block, because it can then be circumvented by the people who create and/or circulate it. It can create a Streisand effect, which only makes that type of content worse by drawing attention to it.
Some of what you can see is in the child sexual abuse material (CSAM) APIs that Google is now sharing to help other organizations detect CSAM en masse in their content. These services were originally built to do so on Gmail, Drive, and YouTube.
AI is really bad at making judgement calls on things like fair use, advertiser friendliness, controversial content (e.g. LGBTQ content), and now whether or not a video is directed at kids. See https://www.cs.princeton.edu/~arvindn/talks/MIT-STS-AI-snake...
To be fair, they're the largest non-pornographic video platform, making them an endless target for any and all abuse possible to grow channels quickly and earn ad revenue.
It's inevitable that in such an environment the system they come up with ends up making less and less sense over time. It's death by a thousand cuts, and I don't think there's a good way to really avoid it.
On the one hand, because they are this large, they cannot curate content. On the other hand, they argue that because they are this large, they are the only ones who can curate this content.
I would feel more sympathetic to your argument if the companies wouldn't argue both cases at the same time.
> I would feel more sympathetic to your argument if the companies wouldn't argue both cases at the same time.
I don't really care about sympathy, so much as I care about reality. The companies will do as all companies do. Try to make money by garnering confidence in their product & abilities. Even if they have to lie through their teeth.
Reality is, there isn't a good system (currently) to really curate massive amounts of user submitted video content, in an environment where anyone anywhere can submit said video content.
The problem's scale is always going to lead towards either a compromise of being an open video platform, or a compromise in relation to curation.
Did the inappropriate content happen with a certain age range set? I totally believe this happens / feel like a bad parent for not cutting it off, but we have the app set on 2 year old and under and I’ve seen a lot of dumb stuff but nothing inappropriate yet. I’m just curious if this becomes an issue that boils down to where you set the age range or if I’m just playing an epically long game of Russian roulette right now.
A friend's 1-year-old was watching YT unsupervised. Some of the videos were super creepy. Like cockroaches coming out of LEGO bricks and being smashed (cockroach blood and things), and then it reversed and the cockroaches multiplied and took over the LEGO tower. The kid was super mesmerized. I was totally WTF-ed.
100% agree. YT kids is not safe for kids. I wouldn’t show it to my kids. I instead download the videos I want and manually make a playlist. There is wayyyyyyy too much garbage in YT kids.
My problem with YT Kids is that the business model of YouTube is inherently incompatible with children and law. Here in Sweden advertising targeted at children is banned[1]. YouTube Kids is more or less all product placement, influencer "reviews" and sponsored content. Why would a child need thousands of videos? I would curate and download a library of videos.
Is it just me to see a bigger problem here - what the fuck is wrong with people to let their tiny innocent offspring just watch some fucking dumb TV?
Is this the best they can do as parents? I mean, we all know just dumb sitting on couch and passively consuming whatever goes in TV ain't healthy mentally (and physically), and they ingrain this approach in their kids since very young age?
Parenting ain't easy, never was and probably never will be, but damn invest some proper human time in it, don't just offload raising of your kids to some stupid website which lives off ads. This is best invested time of your life.
I have a 2-week-old newborn, and damn he ain't getting close to TV/phones/tablets till much, much later, and even then in a limited manner.
After 1 she also got increasingly harder to feed. She would throw away the spoon with her hand and not open her mouth. Just a massive rebel in general. We were really stressed out. She was starting to fall off the growth curve. Pediatrician was concerned too.
Then while eating we once showed her the ABC Kids "Wheels on the Bus Go Round and Round" rhyme video; not sure what clicked in her brain. It was a night/day difference. No rebelling, eats well.
So we kinda use it as the last resort and feel a bit guilty.
Before you're a parent you kinda have these preconceived notions of how you will parent that are all based on a self-centered view. Which I say in the literal sense of the word: they are views on how you are going to parent because you think _____ (a you-you combo). Then you have a kid and reality hits -- this is a young person with their own ideas. You have ways that you want to parent that are going to conflict with what they want, so you have to navigate and negotiate.
You used something they wanted as a carrot (you couldn't not) to get mutually beneficial behavior. You're still gonna feel guilt, but this seems well done to me.
Or just play PBS kids, which has apps on multiple platforms, good quality, and totally free.
Right now my kids are small. I think PBS kids should be enough for them. When they want to find something else to watch, I know they just had too much screen time and they need to do something else other than watching TV.
You replied to a comment where their primary concern is, quote: "When a scary clown jumps out and screams, that's traumatizing". Not porn, sexual innuendo, or pedophiles looking for potential victims. Not war, or violence, or even brainwashing advertisements. It's clowns... their primary fear about YouTube's content is the possibility their sensitive child will be exposed to a clown.
It's impossible to satisfy everyone, regardless of the content rules, and no matter how curated. YouTube could spend a billion dollars curating content, along with surveys where 90% of parents have to approve a video before it is included, and the remaining 10% would still be the vocal minority ruining it for everyone.
Sure, YouTube probably doesn't do quite enough. But when you have parents looking to let YouTube babysit their 2-8 year old children without supervision, whose fault is that? YouTube's, I'm sure. /s
A scary clown. Like Pennywise. Randomly popping up and loudly screaming in the middle of an educational video about buses. I don't want my kids being exposed to that either, and I don't know any parents that would. The commenter wasn't talking about letting their children watch unsupervised, even supervising your children won't stop this type of thing from doing damage.
I'm intrigued that you think that a video that causes persistent nightmares, being afraid to go to bed and trouble sleeping across several nights can't be described as 'damaging'
For something to be damaging there would have to be some damage somewhere. You wouldn't describe a bad head cold as damaging, and that will absolutely lead to a pissed-off two-year-old with trouble sleeping for several nights.
When I taught kindergarten one of the children in the youngest class at the time was terrified of the Eensy Weensy Spider video and of the Elephant in a Hickory Dickory Dock video. I told her to close her eyes when it was coming up. She did. Eventually she got over it.
The idea that a scary clown can be "damaging" is a testament to helicopter parenting. Up to 1900 it would be a rare person indeed that didn't have siblings die before they were ten, and basically no one would make it that far without having a friend or family member die. That's trauma. That's damaging.
I'd describe it as a learning experience. If you don't get opportunities to grow up, you stay a child forever. Having said that, I wouldn't allow anyone younger than six to watch anything on a screen anywhere.
Look up the cartoon characters getting run over by cars, or the very disturbing rabbit hole of injection videos. Kids at a certain age take what they see and hear as core truth, and when there's violence at that level it leaves a lasting mark.
Have you ever read any of the original Grimm’s Fairy Tales? They are gruesome. Cinderella’s step sisters cut off parts of their feet to fit into the slipper kind of gruesome. Children have been dealing with this kind of stuff for a long, long time. Life leaves a lasting mark. If they’re terrified and nothing bad happens they learn they can deal with this. I wouldn’t do that deliberately to a child but lasting mark does not mean lasting harm.
Simply because trauma has kept past generations “safe” by teaching them to avoid any potential harm, does not mean it’s the best option.
Previous generations also used to “instil” a sense of fear in to children by beating them. It does make them listen as they are then ruled by fear-inducing trauma and yet I think we’re all clear on how it’s not the best option for their growth.
The clown was just an example, and it's not just "a clown", it was something designed specifically to frighten young children, where the poster had gamed YouTube's systems to have it show up in the kids section.
And I never let them watch unsupervised. But at least with every other streaming service, I don't need to preview the content ahead of time.
Wow. Who /are/ these people? What is their goal? Are there really people on earth with the exclusive motive to scare a two year old and give her nightmares? That is some next level evil.
The source of these videos could be anything from lone assholes to a state sponsored program to destabilize western civilization. Without tracking down the person who uploaded a given video and hitting them with a sock full of quarters until they explain themselves, it is impossible to know.
YouTube Kids allows you to select which specific channels your kids can watch including selecting from a list of "trusted partner" channels such as PBS Kids. When you 1st install YouTube Kids it leads you through these options.
I think the onus for curating a child's video content rightfully belongs on the one raising the child. How can YouTube (or any content creator) come up with a policy appropriate for everyone? Parents and guardians have different ideas of what their children should and should not watch.
Of course as the parent I am the final arbiter. But I'd like them to take the first swipe. Every other streaming platform for kids does this. Why can't YouTube?
The complaint is they don't curate kids content. Saying there are very basic tools allowing you to do so is hardly solving the problem. It's vastly simpler to simply block YouTube for young children, which also completely solves the parent's problem.
I'm just going to assume you are being sarcastic and making an ironic point about parents wanting to control what their child sees but then not putting any personal effort into it.
The harsh reality is that parents have to do work. I remember when I was a kid I hated that my parents looked through the books I checked out from the library to see if they were "appropriate" and that sometimes they would remove horror novels and other things that they thought were not age appropriate.
But the truth is at least they cared enough to check and at least they were consistent, and later on when I became a teenager they gave me more freedom. If you have opinions about what media your child has access to you have to do the work to curate...
Of course I want to review the content too, which is why I usually watch new stuff with them. But I want the service to take the first go at it. Like every other streaming service for kids.
Same thing at the library. When I get a book from the kids section, I can be reasonably sure that I won't be surprised by some adult content when I sit down and we read the book together.
I’m not arguing that YT is inherently bad for all purposes; just that, in this situation where you want to watch kid’s videos and be confident all the material will be age-appropriate, there are lots of apps that are simpler and better.
Rather than installing YouTube and whitelisting PBS Kids, why not just install the PBS Kids app instead?
But why? Every other streaming service for kids does this for me. YouTube definitely doesn't have anything special enough to warrant any special effort.
Also, they still don't make it easy to lock down to just that content on every platform.
I don't have to worry about this with the PBS Kids app. I know that no matter where it runs it will be safe. My house, grandma's house, whatever.
Same with Netflix Kids. I don't have to worry about it having bad content anywhere my kid figures out how to use it.
It feels strange to me to compare a user-generated content site with a more traditional media platform. And then complain that the former doesn’t work like the latter.
I’ve never really understood how YouTube Kids was even a thing.
There are other things, but the first thing that comes to mind is that YouTube has several orders of magnitude more content than the others. Netflix is not a similar service.
Think about the depth of human suffering you're asking to unlock with this request. Millions of YouTube videos get uploaded every day; at least hundreds or thousands of those must be flagged for YTKids. A whitelist of channels, as offered now, is much better until machine-learning-backed filtering gets a bit better.
You don't need to make a whitelist decision on everything though. Youtube Kids doesn't need to be like Facebook, it doesn't need to moderate everything that's posted. You can be highly exclusionary, you can ignore posts by smaller creators.
It's fine for Youtube Kids to be a much smaller selection of manually curated videos from specific partners, because Youtube Kids isn't replacing Youtube as a whole.
Given that their current solution basically kills ad revenue on those videos anyway, it's not like creators are going to be clamoring to be included on Youtube Kids in the first place. There's no harm in having a subset of Youtube that is highly gatekept, the same way that subsets like Youtube Red were.
Go ahead and let Youtube itself stay as a kid-unfriendly, algorithmically moderated wild west with hidden bombs of objectionable content spread around. That doesn't need to be gotten rid of, just have one corner on a separate subdomain that's nice and safe for kids.
> Go ahead and let Youtube itself stay as a kid-unfriendly, algorithmically moderated wild west with hidden bombs of objectionable content spread around.
That's exactly what the FTC fuss was about, though. It's easy to say that until a parent gives their kid access to it and suddenly YouTube is advertising and recommending stuff to a child, which the law doesn't provide a safe harbor from (not for the content uploaders, either, which is going to cause a lot of problems). The FTC guidance was to identify kids instead by the content they were accessing. Hence Google's complaint: https://youtube-creators.googleblog.com/2019/12/our-comment-... (tl;dr skip to the "Treat adults as adults" section)
You're completely correct. A curated Youtube Kids definitely won't work for the FTC, but it would address jedberg's complaint -- that Youtube Kids is basically worthless as long as it's algorithmically populated with shock/troll videos.
To be blunt, even the FTC's own proposal isn't going to work for the FTC (at least not to the degree they want) -- it's a bad solution with unintended consequences. I've brought up a few problems, but other people have brought up far more of them. At a certain point, beyond explaining the FTC's policy and showing the problems, it's not really worth spending the time trying to come up with a way to make it work. It's better to come up with solutions that would address the concerns of actual parents, like jedberg.
Importantly, the FTC's approach is the opposite of what parents like jedberg need. Parents need a highly curated feed that they can feel safe showing to their kids. The FTC's approach is going to lead to creators erring on the side of labeling too much of their content as kid-friendly. It's also going to force YouTube to get more aggressive with its algorithms, which will inaccurately label objectionable content as kid-friendly.
Parents need fewer videos, more carefully classified as kid-friendly, and the FTC is making sure that a lot more videos will be carelessly classified as kid-friendly.
Like I said below, my assumption based on the FTC comments I've read/watched is that they just haven't thought that far ahead. They're mad about Youtube in general, and they see creators as an easy way to get at Youtube. That's maybe an overly-cynical take on it, but I'm not super-inclined to be charitable to them on this one.
You're using a lot of words to say that Youtube doesn't have a viable business model.
They don't. The cost of providing the service they pretend to provide would greatly exceed the income on that service. Even with the most aggressive advertising one could imagine.
What? That's a wild strawman, but I think it's instructive -- I think this is actually a blind spot for a lot of techies. Point blank: I don't care if their business model is viable or not; I care about the service they provide to society. The service they provide (video hosting, for free, for anyone) is incredibly valuable, almost incalculably so for the amount of creation it has inspired. The fact that kids can see it is just a side effect of the core value, so they should provide tools to help users protect their children from content they don't want them to see. But does that mean they need to hire an ARMY of humans to do that? No; in fact, even if they could afford it, that would be terrible!
> This doesn't solve the number one problem of kids content on YouTube -- they are leaving it up to the creator to decide if it's kids content. It's still not human curated.
You can imagine how Google would automate some nightmares otherwise. Say they look at the age of the people viewing it and if enough people below a certain age view it, they automatically class it as content for children.
Of course ideally something in the middle would be a start, so they could see, what are children watching, let's check that out and see if it is in need of some 18+ filter or not.
Given that they must already be doing some checks, such as age for logged-in accounts and making sure adverts are suitable for children.
>But YouTube Kids will never play in my house again until I know that every piece of content on there has been reviewed by a human before it goes live.
Yes, as it stands the systems Google has in place are not secure, and because of that, unsuitable content could find its way to your children even when you have done all the right things to prevent it. Equally, there is the accountability. If PBS drops a boob at the wrong time when children are watching, it will get dealt with and a process put in place. If YouTube Kids does that, nobody else may even know, as nobody is checking an individual's stream the way they check a network broadcast, let alone with the same level of accountability and process.
You can't compare Netflix/Amazon with Youtube. The amount of content YT must deal with is multiple magnitudes larger. There is no way humans can handle that much content, that quickly and accurately. They really don't have a choice at that volume. Either machines learn to do it properly or there won't be any filtering at all as humans wouldn't be able to keep up.
Yes we absolutely can. YouTube's business model depending on user generated content is their choice. Not being able to review all the content is entirely a problem for Google to solve. Maybe they should stop optimizing for growth and put up some hurdles to who can upload content? That's for them to figure out.
This seems especially true for YouTube kids to me. Do kids really care whether the content was produced a year or a week ago? It's not a news outlet. We literally have enough content available for each kid's whole childhood. YouTube could decide to review and allow one new channel per month per language and the consumers wouldn't be affected.
I guess I should've been clearer about the age/context here. There's a period when they'll watch some show on repeat a hundred times. Or a generic cartoon - it doesn't matter when it was produced.
Sure, it will start shifting once they want to watch something their peers watch. But at the time parents choose the media, how would they even know it's not the latest production, or some mix of old/new?
Peppa Pig started in 2004. At the age kids are interested in it, if you show them episode 2, will they say "this is not from series 6 from 2019"?
They care when growing out of "kids" program and they care for specific series.
Even with a more restrictive approach Google/YouTube could whitelist specific content providers, which could provide fresh content on a frequent schedule.
I never understood this either. Why not go to a whitelisting model for channel owners and just start with large brands like PBS that have a vested interest in their reputation. You get the benefit of new videos from those channels and then slowly work through a backlog.
Now that ads are blocked, what is the revenue model for YTKids videos?
I was about to suggest that content creators pay a nominal fee for the content to be reviewed but if creators have no way to monetize the platform, that proposal is bijective with suggesting that Google just shut down YTKids. Or are parents paying already for access to YTKids and the reviews would come out of that?
The outrage machine would find something new to be upset about. Every week there'd be another channel coming out of the woodwork upset that they could not make the YouTube Kids cut.
Curation is messy, but that doesn't mean we should throw up our hands or, worse, think an algorithm can do it.
The argument here is whether imperfect curation is better than none, and the history of television tells us that it is. We never had a TV channel that randomly showed kids pro-suicide messages.
Not being able to review all the content is entirely a problem for Google to solve. Maybe they should stop optimizing for growth and put up some hurdles to who can upload content?
This sentiment is so shocking that I had to say something.
Please no. Let's not encourage a world where people are even less free to share their ideas.
I get where you're coming from, but "Think of the children!" has been used to justify so many evils throughout history. It's an emotional appeal that should be resisted.
Putting up barriers would also be the first step towards losing dominance. But in the short term, it would suck not being able to share a video without waiting a long time or being approved.
Putting up barriers on one privately owned website isn’t the same as making people less free to share their ideas. Particularly when the barrier might be something so benign as “to be included as kids content this video must be pre-approved”.
But won’t someone think of the poor content creators who might have to wait or upload their content somewhere else!
This doesn't make any sense. There already exist non-user-generated platforms, as mentioned in the previous comments. If you don't like the downsides of the user-generated ones, why on earth would a reasonable reaction be "this shouldn't exist" instead of "this isn't right for me so I won't use it"? What a small, petty way to see the world.
Expecting a service that has "Kids" in the name to have decent kids content is not a dereliction of parenting. It is full of low effort, addicting (to kids) videos.
You wouldn't need to moderate all the content. Just the content that has "safe for kids" stamped on it. If a creator says they are publishing a video "safe for kids" then it gets flagged for review before going live to YT kids. It could go live to general YT prior to that, which could preempt the review process if it was flagged before a moderator even got to it.
I think the issue here is that YT is trying to claim they are providing a YT Kids offering comparable to and competing directly with Netflix, Amazon, PBS, etc. If they want to make this claim and have parents allow their children to view it, then they need to back it up.
edit: I am sure there are more checks in place and that the above idea has been considered. My main point is that they need to take responsibility for whatever claims they are making. If they can't back up the claims, then that is fine - just stop making them...
Maybe you can't, as a corporation worth 100s of billions of dollars, put it out there as being "for kids", if they don't want to pay to hire a legion of people to deal with what they can't scale. They don't get to reap the benefits and redistribute the costs.
Use it responsibly? Are you talking about people uploading creepy videos targeted to kids, or like alcohol or something?
I'm certainly not one to say everything should be as safe as possible for kids, but there's a limit to what a parent can reasonably be expected to understand. If I set my kid up watching a blues clues clip, and look away for 30 seconds and he's seeing a clown with its arm torn off which autoplayed, I feel like I'm reasonably kinda entitled to go hey wtf google, your shit is broken and its harmed my kid.
And then, of course, not let my kid use youtube anymore.
It sounds like you are ignorant of the problem here.
YouTube throws up videos to children that look fine until you get to the segment some psychopath has spliced into them. It's literally not possible to safely put your kid in front of YouTube unless you have pre-watched every video YouTube might potentially auto-serve up to your 3yo.
The problem with excessive regulation is that complying with it can be extremely complicated and time consuming. If you don't have the money to pay for that, you either risk being fined or sued, you don't play.
It sets a bar that only already wealthy people can get over.
I'm gonna strawman here real quick with a meta observation:
HN: parents need to stop being helicopter parents and kids need to have independence to grow in order to be functional adults, otherwise you're a shitty parent
Also HN: parents need to know everything their child does and know what they are watching at all times, otherwise you're a shitty parent
YouTube doesn’t have a god given right to the toddler market. They are the ones implying (through the existence of YT Kids and such) that they have done the filtering!
If they are honest about their lack of actually effective triage, then this whole controversy is of a very different valence.
Stop pretending you’re triaging well if you’re not!
Sure you can. Anything uploaded to YTKids goes through one of two processes.
1. it's from a trusted provider, send it on through. Add a complaint mechanism that makes it fairly easy for a trusted provider to be removed from this list (DMCA for content, if you will).
2. it sits in a queue until a human looks at it. Prioritize videos from those who were recently removed from the trusted provider list as a way to dampen the harm from abusive reports.
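The two-path flow described above could be sketched in a few lines. To be clear, this is just an illustration of the comment's proposal, not anything YouTube actually does; the data shapes, names, and the rule of reviewing recently demoted channels first are all my assumptions.

```python
from collections import deque

def route_upload(video, trusted_providers, review_queue, recently_demoted):
    """Publish uploads from trusted providers immediately; everything
    else waits for a human reviewer, with recently demoted channels
    reviewed first to dampen the harm from abusive reports."""
    if video["channel"] in trusted_providers:
        return "published"
    if video["channel"] in recently_demoted:
        # Jump to the front: moderators pull from the left of the queue.
        review_queue.appendleft(video)
    else:
        review_queue.append(video)
    return "queued"
```

So a trusted channel's upload goes live at once, while everything else sits in `review_queue` until a moderator gets to it.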
Do it by channel. Either a channel is approved for children content by a human or it isn't. Then just rereview periodically. There's much less churn in channels compared to the content.
That wouldn't be good enough. The freaks who like to game the system with scary shit would just game that system too. They wouldn't care that they burned their channel.
The only way this works is to have a human review every piece of content before it goes live, like every other streaming service with a kids section does.
Why not just whitelist channels and disallow ones with less than say 500,000 subscribers. Surely nobody with that large of a following who produces kid targeted content would burn their own channel down just for the "lolz".
But YouTube Kids will never play in my house again until I know that every piece of content on there has been reviewed by a human before it goes live.
I don’t have kids, but this seems a bit over the top. Children live in the world with the rest of us, and are exposed to things every day that haven’t been reviewed and approved as child safe. Angry drivers on the road may flip you off or yell at you while your kid is in the car, or someone at a stop light may be blaring music that you find to be offensive and not approved for consumption by children. Both of these things would objectively be scarier than a clown on a computer screen, yet kids see and hear these things every day and somehow most of them don’t grow up to be serial killers. Sure, try your best to shield them from harmful content, but completely shutting down something that is 98% clean, educational, child friendly content seems like throwing the baby out with the bath water.
You clearly have never seen the type of content psychopaths splice into child targeted YouTube videos or you would not be making this argument.
We are not talking about some mildly offensive or violent behaviour. We are talking about things that would be illegal to show on prime time TV (child and adult timeslots)
10 great child videos does not make up for 1 psychopathic murder scene spliced into video 11
OP made no mention of “psychopathic murder scenes”. He referred to clowns coming out unexpectedly, which seems in poor taste on the part of the video creator but is a far cry from a “psychopathic murder scene”. I am quite certain that if such a scene were spliced into any video I watched, even as an adult, I would never watch YouTube again, and I would make some mention of it on social media. We live in a culture where even the slightest faux pas sparks widespread outrage across social media, yet I have never heard of this being a significant issue. So it seems like this is not a common occurrence.
Have you ever personally witnessed a “psychopathic murder scene” being spliced into a YouTube video? Do 10% of videos really have this in them? I have never seen it once, in well over a decade of relatively heavy YouTube use. Most of my friends are in the tech scene and have children, and none of them have ever mentioned this as an issue either.
Maybe it's time for piracy and people power to take over. Parents curate for other parents, perhaps parents that know each other. Use youtube-dl and other things. Have an open source app (You might have to get away from iOS for this) on a tablet that can watch this community curated content.
wrt piracy: keep in mind that parents are often relatively financially disadvantaged, as well as being the most financially targeted group. Some parents turn to YouTube rather than (say) Disney+ for a reason. Those that can afford it do so because it takes less mind time and real time than even community curation.
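The community-curation idea above (parents share a whitelist of approved channels; an open source player app refuses anything else, e.g. content fetched with youtube-dl) could be sketched roughly like this. The channel names and data layout here are invented purely for illustration:

```python
# Hypothetical sketch of parent-curated playback: a shared whitelist of
# approved channels, and a filter that only lets whitelisted videos through.
# Channel names and the video dict shape are made up for this example.

APPROVED_CHANNELS = {
    "Super Simple Songs",
    "Daniel Tiger's Neighborhood",
    "PBS Kids",
}

def is_playable(video: dict) -> bool:
    """Allow a video only if its channel is on the shared whitelist."""
    return video.get("channel") in APPROVED_CHANNELS

# A queue of candidate videos, as the player app might see them.
queue = [
    {"title": "Wheels on the Bus", "channel": "Super Simple Songs"},
    {"title": "Surprise Eggs #4717", "channel": "RandomUploader123"},
]

playable = [v["title"] for v in queue if is_playable(v)]
```

The point of the design is that the default is deny: anything not explicitly vetted by a parent never reaches the screen, which is the opposite of YouTube Kids' allow-by-default-plus-filtering approach.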
How old are your kids? I watch my 8 and 11 year olds on YT Kids fairly routinely, and while lots of the content is kinda junky and of questionable quality (so many toy unboxings and eat-a-disgusting-thing videos...) I haven't personally seen anything I'd call "inappropriate".
My complaint is the opposite: lots of good, sensible content (most recently I was trying to set up my youngest to watch drum cover videos) just isn't there because no one thought to put it there.
I'm not the person you're replying to, but my daughter was able to use my wife's iPad when she turned 1. I decided we should 'baby proof' it, remove YouTube (along with a few other apps), and add YouTube Kids. In many ways YouTube Kids was worse than YouTube, since YouTube would at least have shown our suggestions. YouTube Kids is full of addicting, low-effort content. Think toy unboxing, surprise eggs, videos with colors. Nothing thought-provoking or educational. They're mindbogglingly boring, but they turn kids into zombies. And they get outraged if you take it away.
You have to whitelist a few channels, then it works ok. Though my daughter is only 3 now, and would be happy watching Blippi all day. Seems like it would be more complicated to whitelist channels as kids get older.
OK, that's just not true. There's a ton of great stuff on YT Kids (though as I mentioned not enough). But yes, you have to find it for them and show it to them yourself. Left to their own devices most kids aren't going to make good choices. They're kids, that's the point.
This is just the "be a parent" thing, really. There's absolutely no way to get around the need to educate your own children. Certainly Google isn't going to do it for you. And Google certainly didn't invent junky content; I remember staring at Hanna-Barbera garbage constantly as a child.
I didn't mean to say that there was nothing thought provoking or educational, just that YouTube wasn't showing it. It shows high view count, mindless, addictive videos. Even if you start with a good one.
The funny thing is, I had to choose my child's age range (or enter the birth year or something) when setting up the app. Then, when I went to whitelist channel by channel, it had a bunch of recommended channels full of good/decent content. That should have been the default whitelist out of the box.
It isn't just about "being a parent". My daughter is very smart for her age. My wife used to be a teacher, but since we had kids, she stays at home with them. She has taught my kids a lot. She has learned a good amount from content on YouTube, Netflix, and Disney. I think it's good that she can use an iPad better than my parents (even though she can't read). She also can solve puzzle games on her own. Of course we guide her and limit her usage. My point is that an app marketed to kids should not be full of garbage content.
Well, according to the FTC and Google, adding the video to YouTube Kids means never getting your video recommended or added to playlists, no comments or subscriptions, and no ad revenue. Who would want that?
Dare I ask why you put your guards up against YouTube, instead of either deciding that your kids are too young to use a tablet/smartphone/PC in general, or that sometimes content is scary but it's OK to talk about?
Ps: I'm not trying to change your mind, I'm trying to understand the reasoning behind it, coming from a more liberal raising and as a result a more relaxed attitude to what is banned and what is not banned.
You clearly haven't seen some of the horrifying stuff that slips past the filters. We're not talking about the episode of Leave it to Beaver where Wally smokes a joint. There is a category of malicious amateur cartoons that look kid-friendly in the thumbnail and in the first few minutes, and then become sick. The Elsagate stuff was part of that. I have seen, on YouTube Kids: an ice cream truck driven by a cartoon animal, who poisons and kills the other cartoon animals; and a knock-off Mickey Mouse cartoon where Mickey rips his own head off unexpectedly, with full blood and gore. This is not "have a discussion with your kids" territory, it's nightmares and trauma. As a parent, the risk of that happening obliterates the value proposition of YouTube for kids.
> This is not "have a discussion with your kids" territory, it's nightmares and trauma.
The big tech companies are so far removed from real users that they'll just tighten a few screws on their machine learning and call it fixed. You're a roundoff error to them, unfortunately. These people are the real machines, servicing the electronic machines to make money for their masters, the execs and stockholders.
I have seen it, yes. The sudden gore is obviously unfortunate for a toddler to stumble across, or even for an older child. I understand that you would like to shield your children from such content, but I also strongly believe that it's possible to do so with communication, rules, and strict guidelines on how to look up content and handle suggested videos.
"As a parent" is one of my favourite phrases, by the way. It translates to "I know better than you, regardless of your parental status". Which is fine. I was only wondering about the reasoning, after all. :)
> "As a parent" is one of my favourite phrases, by the way. It translates to "I know better than you, regardless of your parental status".
I sympathize with this. Lots of parents are vociferous morons, because lots of people are morons and becoming parents does not fix that. But: There is a real difference, at least for many people. Your brain rewires itself to be hyper-vigilant about child safety and child harm. I can't watch TV or movies anymore that involve harm or the threat of grave harm to children, but these wouldn't have bothered me before. When I enter a room, I subconsciously scan it like Jason Bourne if he were a toddler's bodyguard. Slippery surfaces? Sharp table corners? Unfenced stairs? I've compared notes with friends and this is a common, if not universal, change in parents. They spend a lot of time thinking about child safety.
I think it's pretty fair to expect that a product or platform like YouTube delivers a consistent and predictable experience instead of blaming users.
Yes, if you are worried, the smart thing to do is not to expose your kids to YouTube at all. Wishing that YouTube could be better is independent of whether you choose to expose your kids to it.
Also, it has nothing to do with being liberal or conservative, at least with me. I will let my kids watch a lot of stuff, if we intentionally pick it out. If you saw random baby shark or finger family between whatever videos you wanted, you'd be right to complain and no one would blame you.
There is zero product design behind the way kids are exposed to scary things. No product manager decided it should be that way. It just is that way because it's not prioritized by the business or it's too expensive.
Imo if YouTube markets at kids they should be accountable. They should clean it up.
> Dare I ask why you put your guards up against YouTube instead of either deciding that your kids are too young to use a tablet/smartphone/PC in general, or that sometimes content is scary but it's OK to talk about?
My kids are too young to have unrestricted access to the internet. That's why they are limited to specific kids apps.
But YTKids is different than all the other kids apps. All the other apps have a human check all the content before it goes live.
So I can be reasonably certain that the content will be ok without watching it all first.
With YT, I can't have that certainty. So they just don't get to use that platform anymore.
Yeah, bad things happen, and when they do I talk to my kids about it.
But I don't want to purposely subject them to such things if I don't have to.
Dude, the stuff we're discussing freaks me out just looking at it, and I'm a grown adult. If I'm watching a horror/gore movie, I'm fully prepared for everything that's coming. If this kind of thing randomly appeared while I was watching a video about how to make spaghetti and meatballs, with no acknowledgement (as tends to happen in these kids' videos), it'd fuck with me big time.
Not the OP, but we tried that with my daughter. It turns out that’s only a viable strategy if you’re willing to absolutely never use a phone or tablet or computer or tv in their presence, ever.
Why is that? My daughter has an extremely restricted access to screens. I code for a living. She understands that is my work. She understands we consider she is too young to decide the contents she watches. And even when she doesn't understand, she understands it's not her call.
You have a very understanding kid then, I'm jealous.
We let ours decide what to watch / play out of a handful of things we've hand-picked as OK (such as Daniel Tiger's Neighborhood or Super Simple Songs), and iOS's guided access mode has been a godsend for this.
Super Simple Songs is on YouTube and has great content for kids, but I refuse to use YouTube Kids; we tried it once and couldn't control it well enough, and weird videos started creeping in almost immediately. So we use regular YouTube and disable autoplay, which we have to manually check every single time (YouTube has randomly re-enabled it for us a couple of times, which led to the weird videos again).
We do limit her screen time (and consequently, ours), but trying to keep her away from them 100% of the time? That's honestly not realistic, nor fair.
This.
I was also not allowed to watch certain TV shows if they happened to air at a time where I wasn't put to bed yet. And don't get me started on magazines in the supermarket. I didn't understand why not, but I knew that disrespecting the rule was not worth getting my curiosity satisfied.
You were still allowed to watch TV shows then, I presume? Just like pretty much everyone I know? Our daughter is too, and we absolutely control what she can and can not watch. But in HN terms, YouTube is a massive foot-gun compared to other services. Auto-play very quickly leads you to trouble, and you can never quite trust YouTube to not re-enable it, or show thumbnails for inappropriate content at the end of a video.
At our daughter's previous day home, the provider started letting her watch a YouTube channel called BabyBus (against our previous agreement of no TV, which was the straw that caused us to quit her service). At first glance the show is innocent enough: it's just a bunch of cartoon characters singing and going on adventures. But if you actually pay close attention to some (not all) of the episodes, it's a weird, fear-based morality show from China that threatens death if you don't behave in a certain (rather authoritarian) way.
The frustrating part about it is that there is a ton of really great kids' programming on YouTube that simply wouldn't be possible without it. It just takes an order of magnitude more effort to sort it out than pretty much any other service that provides content because you have to curate everything yourself.
Honest Question: If you did use a device and disallowed the same for your daughter, did your daughter "freak out" in some manner?
I ask out of curiosity and personal learning, not out of any sort of judgement. I do not have children and if I did would likely not have had so in time for this to occur.
It's not quite as simple as you've laid out, but sometimes it can be, depending on everyone's moods. Sometimes she's perfectly fine playing in her play kitchen while we watch a 20 minute show, other times she wants to watch a nursery rhyme and sing and dance along, or watch Lucas the Spider while she cuddles her stuffed version of him.
You don't realize just how much free time you really have until a toddler demands it all away from you. Meanwhile, dinner needs to get cooked, dishes need to be done, laundry has to be washed and folded, the house needs to get cleaned, etc. And your kid wants / needs attention every few minutes (at least ours does—she can only colour or paint or play with toys for so long by herself before she wants to play with you).
I don't understand that thing with the scary clown jumping out and screaming, or any other shocking content like that. What do the creators gain from those videos? Or are they just psychopaths?
We're dealing with two-year-olds here, you know: the type that can't even run away from danger if they recognize it, and even if they did, they couldn't do much about it because, again, they're small children.
Unless you are able to watch them every waking hour, this is simply not possible. At some point you have to cook, clean and generally do all the stuff that needs doing without worrying too much about your child hurting itself.
What generally happens is you set rules and teach what is acceptable or not. When you are not around the kid is expected to follow the rules.
Many kids do not follow these rules. They get punished and have valued things taken away. Some learn a lesson and will follow the rules and others do not.
The ones who still don't listen have to be with you when you cook and clean. If you go out another parent/babysitter/old children will watch them.
Unless you are able to watch them every waking hour, there's nothing preventing your child from clicking on a video which is NOT labeled child-friendly.
It is not Youtube's responsibility to raise your children.
> I used to let my kid watch YTKids while I supervised.
I think this is where you lose me: why is allowing a child un-tethered access to a content portal appropriate? i.e. why is the responsibility on the content portal (YouTube) and not the parent or supervisor?
I might be a tad too much on the individual responsibility side of the spectrum here, but just like the old argument from the 90's ("The TV ain't a replacement for a babysitter...") I'm not sure the responsibility necessarily needs to be on the platform here over the parent/supervisor.
One suggestion: You can create a white-list or playlist of pre-approved content creators and/or videos, and offer access that way.
It’s easier and more foolproof to just not use it. That’s the point: their service isn’t working, and it’s better for many parents to just tune it out.
That’s parenting - avoid the shitty services that can’t put out a good product so you can focus your effort on something more meaningful. Or YouTube can fix the problem and it will be allowed in my household.
We generally trust that every page of a kids book will be roughly as expected given the title and author of the book. We trust the same of things like PBS kids programs.
>why is the responsibility on the content portal (YouTube) and not the parent or supervisor?
Because, given how easy it is to access content on the user side, there's no other feasible way to do it. You can take the TV remote away from your kids, and that only turns the TV off. Tablets, PCs and so on are general-purpose computers with plenty of workarounds; it's extremely hard in practice to stop kids from accessing content that they can technically access, without greatly diminishing the value of the device or service.