Toxic Cooking Show
Misogyny, $800 first dates, simps, and high-value women: Social media has been busy cooking up and feeding us an addictive but toxic slurry of trends over the past few years. Here at The Toxic Cooking Show we're two friends dedicated to breaking down these trends, terms, and taunts into their simplest ingredients to understand where they came from and how they affect our lives. Join us each week as we ponder and discuss charged topics like personal responsibility and "not all men" before placing them on our magical Scale O' Toxicity. Any comments or topics you want to hear about? Write to us at toxic@awesomelifeskills.com.
Behind the Lens: The Realities of Child Influencers
Can sharing your child's life online do more harm than good? In this thought-provoking episode, we tackle the world of momfluencers and child influencers, diving into the concept of "sharenting" and its ripple effects on children's lives. Sharing family moments on social media may seem harmless, but we explore real-life cases where it has led to bullying and embarrassment. Our conversation tracks the shift from early social media platforms like Facebook and YouTube to the newer realms of Instagram and TikTok, shedding light on the growing phenomenon of parental oversharing and the urgent need for more mindful practices.
As we navigate this digital labyrinth, we raise alarm bells about child privacy and the looming threat of online exploitation. Parents often face a tricky balancing act between sharing and safeguarding, yet the stakes involve more than just social media fame. By sharing stories of families grappling with the consequences of exposing their children's lives online, we highlight the vulnerabilities these children face, from potential exploitation to the ethical quagmire their parents tread. The discussion doesn't shy away from examining the uncomfortable responsibilities parents shoulder in this brave new digital world.
The narrative takes a poignant turn as we confront the unsettling trend of young influencers posting provocative content. We scrutinize the moral implications and the roles of both parents and tech companies in perpetuating this cycle. Frustrations bubble over the lack of actionable measures against inappropriate content, but there's a glimmer of hope as we spotlight recent legislative efforts to protect these young talents and their earnings. Through it all, we issue a clarion call for awareness and proactive measures, urging listeners to rethink how we navigate the digital landscape with our youth at the forefront.
Hi and welcome to the Toxic Cooking Show, where we break down toxic people into their simplest ingredients. I'm your host for this week, Lindsay McLean, and with me is my fantastic co-host.
Speaker 2:Christopher Patchett, LCSW.
Speaker 1:And our EP.
Speaker 2:Little Miss Molly.
Speaker 1:She's snoozing, hopefully.
Speaker 2:After being dragged away from the bed.
Speaker 1:It's hard. It's hard when you're a little dog. So have you noticed we always start our episodes like that? I've been trying not to, but I couldn't think of a better way for this one.
Speaker 2:I did the same exact thing for the last episode. Yeah, I was like... so.
Speaker 1:Tried to think of something else. I was like, no, it's got to be another episode, I'll do something different. So we've talked about the history of the mamasphere and momfluencers, you learned about them, and family vloggers. I don't know if I fully explained that: family vloggers are kind of included in that, they're just not necessarily women, but they're parents who are showing off their family. And to be a family vlogger, there are two-legged children involved. I don't think I've ever seen anyone call themselves a family vlogger and be like, hi, it's me and my six dogs. You could, theoretically, there's nothing stopping you; I think you're just maybe going for the wrong crowd there. So it's the two-legged variety.
Speaker 1:Then we talked about the super whack jobs, the real crazy of the crazy, the ones who have been arrested for whatever nefarious deeds they were doing. Like, you know, we went all the way there. But now I want to back up a little bit, not too much, just a little, and talk to you about some of the more normal things and normal people that you're going to see within this sphere, and just how terrible they also are. These are the momfluencers who are managing their childfluencers. I think that one's still child influencer, but it's momfluencer.
Speaker 1:Fucking English language. And most of these people that I'll talk about here have a decent following. Certainly they've ended up on my personal radar, which is why I decided to do this, because I saw one too many posts where I was like, oh no. So they're big enough that they know what they're doing; they know the risks. And what are some of those risks, I hear you asking in your melodious voice over there? We can start off with just plain old oversharing, which apparently has a name at this point: it's called sharenting. It's when parents share everything.
Speaker 2:So I know from past episodes we kind of talked about this, where, especially with little kids, it's, oh, Bobby pooped in the bathtub today, and it's like, maybe let's not put that out there.
Speaker 1:Yeah, we talked about that in terms of, like, an almost hypothetical: this is being put out there, this is going to affect kids. It already is. You and I just had this conversation about feeling old, because there are kids born in, like, the mid-2000s who are like, I'm a teenager now. There are people whose parents are posting this stuff, and it has caught up to them in school. I mean, imagine knowing the date of your first period because your mom posted it on Facebook for the world to see, and you are now an adult, you are in your 20s, and this information is out there because Mommy Dearest felt the need to share it, and now everybody knows, including you.
Speaker 2:I'm so happy that my mom never posted about my first period.
Speaker 1:That was very nice of her. I'm glad she didn't either. This same person from the article also spoke about how she had a staph infection, and apparently mom posted about that too, and this got back to her high school classmates, who of course made fun of her for it. This is already happening. The parents who overshare, their kids are not small babies. Their kids are in high school, their kids are in college, their kids are legal adults. And it's happening already: we are oversharing to the point that people are being bullied, people are facing that. And this was years ago, because she's 25 now. So imagine, even 10 years ago we were at the point where parents were oversharing, and that was getting spread around, and your high school teacher knows that you had a staph infection.

Speaker 2:Okay, I'm trying to think back, kind of doing the timeline of Instagram.
Speaker 1:So Instagram... well, I guess these would be posted on YouTube more, these would be on Facebook, for some of this. So it's not even like these are parents who are super big influencers, but this was before Instagram and TikTok. This would have been the Facebook, kind of YouTube, blog era, when people were sharing. I know this specific person spoke about the fact that their mother did get, like, concert tickets and some other stuff, and she wasn't just posting for funsies. This was a bit of a thing that she did to get stuff; she just shared everything about their lives, including this type of very, very personal information. And again, based off of the age, I would probably say that this was like Facebook to YouTube.
Speaker 2:Who knows what it could have been. Because, yeah, if she's 25, then this would have been like maybe 2010.
Speaker 1:Yeah, yeah, and that would have been.
Speaker 2:That was before Instagram. Yeah, Instagram, I think it started in 2011, and I associate it with...
Speaker 1:Like, 2011, 2012, '13, '14 was the big post-photos-of-things era, but people weren't influencers on there yet. It was still like when Facebook used to be, where it's just like we're all friends. But Facebook had already started turning to, like, sharing stuff: just post it all.
Speaker 1:So, yeah, that's happening: kids are getting blackmailed, they're having racy photos of themselves sent to their schools. Either photos that they took, or, and we may or may not get into this in this episode, photos their parents may have taken of them in tight or revealing outfits, and these have been stolen or acquired and are being sent to the school. Like, here's a photo of this kid, here you go. And then the kid gets in trouble, because why is this photo of you making the rounds on the internet? Like it's the child's fault.
Speaker 2:Yeah, that's... I'm so happy that this wasn't a thing when I was growing up. I know.
Speaker 1:I mean, my parents wouldn't have. My parents understand that boundary, but I'm just really glad that it's not a thing with my friends either. All of my friends, the ones who have kids, are very careful about what goes online about their children, and I appreciate that. I appreciate that I'm not getting poop stories from them, and I appreciate that they're taking care of their child's privacy. We love this for your kid. I want this for your kid and for your family. Yes, yes, yes. So you might ask, why do people keep doing this? What are they getting out of it?
Speaker 2:Money.
Speaker 1:Bingo. I know that was a hard question. But you get a ton of money, and the way you get all this money is the almighty, all-seeing algorithm, and that's problem number two. So, to clarify what exactly an algorithm is, because we just kind of talk about, oh, the algorithm, the algorithm. What is it, though? Because I think some people don't actually know.

Speaker 2:So, like, I think, having done this podcast and everything like that, trying to understand the algorithm... it's funny, because it seems like the more you learn about the algorithm, the less you know about the algorithm.

Speaker 1:Yeah, because then you know technically how it works, but still, like, what? So, in broadest terms, it's the set of instructions for what content you're going to see. Each site has its own algorithm. Facebook has one, Instagram has one. Yes, they are owned by the same company; they may still have different ones, we don't know. They don't publish all of this information, which is part of the problem: they don't want it out there. Their algorithm is their own.
Speaker 1:This is why TikTok got insanely popular: the TikTok algorithm was very good at picking up on what people might want to see. So, in general, the algorithm is looking at user history, location, your profile, the popularity of the content, your current activity, your friends' activity. All of this kind of goes in together, gets turned into a little equation, and out the other end it says: we think you want to see this cat video, because your friend watched it or you've watched ones like it in the past. And it's also looking at what people are interacting with, and this is very key. Interacting means you watch it, comment on it, send it to a friend, save it. All of that counts as interacting.
Speaker 1:You can try this. You can go on Instagram right now and save a couple of cooking videos. Even though your saves are private and I won't ever see that you have saved them, you're now going to see more cooking videos. And as you watch more, I am more likely to see cooking videos, because you are interacting with them and you and I are friends, and so the algorithm is going to pick up on that, and I may start to get a little bit of that. Now, obviously, it's very nebulous as to how much, but it does actually have an effect: the more you watch, and because we are friends on there, the more it shows, the more you interact with it. And people forget that when you're commenting on videos, that video is still playing in the background, racking up views. That tells Instagram that this is an exciting video. Instagram, Facebook, whatever, they're all the same in this case. It tells whichever site it is that this is a really great video and that more people want to see it, even if you're there in the comments picking a fight.
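The interaction signals described above (views, comments, saves, shares, friends' activity) can be sketched as a toy ranking function. To be clear, this is purely illustrative: no platform publishes its real algorithm, so every function name and weight below is an assumption, not Meta's actual code.

```python
# Toy feed-ranking sketch. Purely hypothetical: real platforms keep
# their ranking systems private, so these weights are invented.

def score_post(views, comments, saves, shares, friend_interactions):
    """Combine interaction signals into one engagement score.

    Active signals (comments, saves, shares) are weighted more heavily
    than passive views, mirroring the point that even an angry comment
    promotes a video.
    """
    return (
        1.0 * views
        + 5.0 * comments             # any comment counts, even picking a fight
        + 4.0 * saves                # private saves still feed the algorithm
        + 6.0 * shares               # sending to a friend is a strong signal
        + 3.0 * friend_interactions  # "your friend watched it"
    )

def rank_feed(posts):
    """Order candidate posts by descending engagement score."""
    return sorted(posts, key=lambda p: score_post(**p["signals"]), reverse=True)

feed = rank_feed([
    {"id": "cat_video",
     "signals": dict(views=900, comments=2, saves=1, shares=0, friend_interactions=3)},
    {"id": "cooking_reel",
     "signals": dict(views=400, comments=80, saves=50, shares=20, friend_interactions=10)},
])
print([p["id"] for p in feed])  # ['cooking_reel', 'cat_video']
```

Note the design choice this toy model shares with the behavior described in the episode: a hostile comment scores exactly the same as a supportive one, so "picking a fight" in the comments still pushes the post to more people.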
Speaker 2:All publicity is good publicity. And this is the thing. I think I've talked about this in a previous episode, where somebody had said something outrageously harsh, like, you know, during the elections and everything, and I wanted to comment. And as soon as I brought up the comments, I saw somebody had said the same thing that I was going to say, and underneath it the person who posted said something along the lines of, yeah, that's great, you just posted on my thing and you're just driving my video to be seen more. Yeah, no, thank you.
Speaker 1:And then so, yeah, it's picking up on exactly that, exactly that like, just any comment is a good comment. So even if people are just leaving emojis or stuff like that, like that pushes that to a bigger audience. It tells the algorithm yes, keep sending. And that also means that you don't necessarily have control over where your content goes. You can try to, obviously, if you have like ads and stuff running, you can try and target certain groups, like people who like dance, people who are, you know, into volunteering. You can try and target that specifically. But even that like it's not looking at what people have reported.
Speaker 1:When you sign up for Instagram, you don't say here are my interests and that's it.
Speaker 1:It is basing it off of what you're actually consuming.
Speaker 1:So if you're consuming a lot of dance videos, it's going to put you into that category of is interested in dance, which means that then when somebody runs an ad for dancewear shoes, tutus, leotards, whatever you are more likely to see that ad, no matter who you are, which is why little girls' clothing and jewelry pages are very popular with adult men, and that is a proven fact. Unfortunately, the New York times I hope you're ready for this New York times did a test. Uh, cause somebody had contacted them about it and they saw the ads that she had run. This was jewelry for like five-year-olds, this is not adult jewelry, this is child jewelry, for children, and so in the ads, some of the ads had, like, the actual child shown wearing the bracelet or whatever it was. Others was just you know the bracelet, like oh, buy this now, type thing, and I quote the ads got direct responses from dozens of Instagram users, including phone calls from two accused sex offenders, offers to pay the child for sexual acts and professions of love yeah and then meta.
Speaker 1:When new york times contacted about this, meta was like I don't think that was a very good study. I think you like rigged it. I mean two accused sex offenders tried to call this child or the person who is managing the account for the child.
Speaker 2:Yeah.
Speaker 1:People were offering to pay a five-year-old for sexual acts. Even the professions of love, that's gross. You're an adult man. You're an adult, anyone, I mean, you know.
Speaker 2:And then, oh God, like, even to say "what's up" to a five-year-old? Yeah, absolutely not. I couldn't even imagine, even Hannah, like, no. You just don't, you recognize?
Speaker 1:Even I, as a woman, even before I did the research for this, if I saw a page of, like, a young girl who was posting sports or anything like that, I was very careful about what I liked or commented on. I don't comment on basically anything, because I find it mildly inappropriate that I, as an adult, am watching you do gymnastics that your parent has clearly put you in and is clearly pushing, and you may be wearing a slightly revealing outfit. Gymnastics, a little bit more, you're like, okay, you're wearing the outfit because this is what you wear. A little kid in a bathing suit? You're like, nope, nope, nope, I'm not touching that. I'm not touching that with a ten-foot pole, no matter how cute you are. I don't want to be anywhere near this photo.
Speaker 2:I'm literally trying to think in my mind if there would be an appropriate time. Like, even a parent taking a video, and the kid is, you know, something in the background. But no, I can't even think of one.
Speaker 1:Yeah, you just don't need to be commenting on this stuff, especially kids where you have no idea who they are and they just breeze through your life in this video. Imagine being like, hey, cutie. Oh, that's so sweet? Look at your life and look at your choices. You were an adult commenting on the video of a kid.
Speaker 2:You were an adult commenting on the video of a kid I think the only only possible way I would ever even like that would be is if it's my friend's kid exactly.
Speaker 1:Exactly. Yeah, if it's my friend's kid, if it's on something like Facebook, then yes, I may like the photo and be like, oh my god, they're so cute. But again, most of my friends do not post photos of their children online. I have friends who have actually said, I don't send photos of my kids to people; you just won't ever see them. I have one who has never sent me a photo of her son. When I see her, she has tons of photos on her phone, and so she'll go, oh look, we were doing this, we were doing that. She just doesn't post them, because she doesn't want them to potentially get out. And they're not incriminating; it's not like these are photos of him as a little four-year-old running around butt naked.
Speaker 2:It's just a child. And, you know, one of the things I did hear, and I don't know if you're gonna bring this up, is that sex predators are going to do things like: oh, here's Samantha in her cheerleading outfit, and the cheerleading outfit has, you know, the Shamney Indians logo, and now that person's able to say, oh, she goes to Shamney High School.
Speaker 1:Way to jump ahead of me. Excuse you, I was going to talk about that. Fine, we can talk about it now. No, it's absolutely true. These kids, even if they are just promoting something as simple as dancewear, or, here's me in my latest cheerleading outfit.
Speaker 1:You're now putting this on the internet. People see it for that. Who knows who these people are? They now start to follow your account.
Speaker 1:You can trace back with shocking ease, more than a lot of people realize. You can figure out where somebody lives. Yet with shocking ease you can find out a lot of information about people, and there is one couple in particular that they've been called out by their fans. They are a lesbian couple in the UK, caitlin and Leah. They have two children who they have said the kids' names on the internet. They don't show the kids' faces but they'll post photos, like real-time photos of them at a pumpkin patch and the kid from behind the one who's mobile and I was like you, could you and people know where you live?
Speaker 1:Like the town, someone who was really gross could absolutely just be in town, could be watching your stories, sees this story of you and your wife and your kids at the pumpkin patch, goes there and then you know, finds your kid and is like, hey, little so-and-so, you know, mama, told me because they know that you use these names like mama and mommy or whatever it is told me to blah, blah, blah. And they know all about the kid because every bit of information has been shared. And then how does the kid know that this is not appropriate? You know my mom, you know my other mom, you know who I am and you know this stuff about me. You must be okay, right? How would you know all this? And even if you don't even go that far, what's to say that somebody doesn't just go there and take a picture of your kid? And now they have a photo of the kid's face?
Speaker 2:Oh God.
Speaker 1:Yep, yep, yep. So keep in mind, too, the kids who do sports like dance, gymnastics, swimming, where they're in revealing outfits, kids who are just taking photos in bathing suits, let alone the ones who have been adultified and are wearing makeup, grown-up clothing, that type of thing. These parents know what they're doing at this point in time. These parents know, and I read some really horrifying articles of parents talking about this, and they knew. They fully admitted it. They're like, yeah, every day I wake up and I delete followers off of my six-year-old daughter's Instagram account, and every evening before bed I delete followers off of her account, because they're gross pedophiles. But I keep the account. The account is still there, and I keep posting photos to it despite this.
Speaker 2:I saw something along those lines as well. I forget exactly who, but this was maybe a year or two ago. I saw something and I was actually going to comment on her page, like, hey, it's probably not a good idea. But other people had commented the same thing.
Speaker 1:Yeah, they don't care. Again, Caitlin and Leah have been called out. You can scroll through their comments, and people are like, this is not safe, what you're doing. There's another couple, Julie and Camilla, again a lesbian couple. I think these two couples tend to get a lot of callout, not because they're lesbians, but because they're both very vocal about privacy and caring about others and empathy and social justice. And so to have that side of things, where you're like, yes, this is all good, and then they turn around like, I'm exploiting my child for money? Oh, okay. I mean, Julie and Camilla have a son, and they have not shown his face online, and they have not mentioned his real name; they use a nickname for him.
Speaker 1:They were at one point alerted to the fact that photos of their kid were potentially making their way around the dark web, and their only response to that was like, oh, we're looking into it, thanks. And then they just started blocking everyone who mentioned it. Julie and Camilla: look up the Camilla lore; Julie had a whole bunch more, because Julie is the one who was actually pregnant with the kid, and she had a hissy fit and hid all of her stuff on Instagram for six months.
Speaker 1:They're problematic for so many reasons, but this is just one of them. They don't always do a great job of covering his face when they take these photos. Maybe a year ago somebody had alerted them to this, and people were like, ooh, you need to be really careful. And they were like, thanks, we're looking into it. And they've continued to post photos, and people are like, yo, your kid's feet are just hanging out, and you see so much of him, please be careful. And they just delete comments.
Speaker 1:Yeah, exactly. You see a lot of his face.
Speaker 2:And great. Now he's going to be on my fucking algorithm too.
Speaker 1:I know, sorry.
Speaker 2:Son of a bitch.
Speaker 1:Well, and that, you know, combined with videos and stuff like that. Again, like we just said, people know where they live. Ooh.
Speaker 2:Ugh.
Speaker 1:But yeah, look, you've got to make that money, and so that's why these people are willing to sit there like, yeah, I have to delete followers constantly because they're nasty, nasty men, but... the money. Because keep in mind, at the beginning of this year, by the statistics I found, you could be getting around $18 per 1,000 views on YouTube, and if you have a bigger channel, it could be more. Plus, then you've got sponsorships, you can run ads; you could be making way more than that if you have any sort of decent following. That's YouTube. Instagram doesn't pay the same way, but on Instagram you can have ads, you can have sponsorships, all of that running too, that you can be making so much money off of.
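For scale, the arithmetic behind that $18-per-1,000-views figure is simple. The helper below is only a back-of-the-envelope sketch using the rate quoted above; real YouTube RPMs vary widely by channel and niche, and sponsorships come on top.

```python
# Back-of-the-envelope ad revenue at the ~$18 per 1,000 views rate
# mentioned in the episode. Real rates vary; this is only a sketch.

def ad_revenue(views, rpm_dollars=18.0):
    """Estimate ad earnings, where RPM is dollars per 1,000 views."""
    return views / 1000 * rpm_dollars

# One video with a million views, before any sponsorships or brand deals:
print(ad_revenue(1_000_000))  # 18000.0
```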
Speaker 1:Which is why you don't necessarily want to be deleting these people. You want these people interacting, because all those comments, the hearts, "hey sexy," "I'd fuck that," "why don't you give us a little tease and lift up your skirt," these are real comments that I saw while researching this, left by men, and nobody had deleted them by the time the screenshots were taken. They had managed to stay there long enough to just build up, because that is interaction, and that interaction drives the channel. And when your channel is moving along like that and you're getting all this interaction, it's going to be pushed to more people, and you can make more money off of your child, who you are exploiting. Gross, gross.
Speaker 1:Yep. That's literally the only thing I can say: just gross. Back in 2020, Meta found that there were 500,000 child Instagram accounts that had, quote, "inappropriate interactions" each day. Half a million kid Instagram accounts, not just Instagram accounts in general, because, look, if adults want to do that, that's fine. Y'all be gross to each other, but not on a kid's account.
Speaker 2:Yeah, no.
Speaker 1:Yeah, half a million every single day in 2020.
Speaker 1:And yet people are still out there posting this stuff because you can get so much money from it.
Speaker 1:You can be making hundreds of thousands of dollars by running these type of accounts, and even if you're not making upfront money, a lot of these are getting things, especially the moms who are running accounts for their daughters who do stuff like cheer or beauty pageants or dance cheer or beauty pageants or dance Because when you go to these competitions, it's quite expensive to pay to travel to the competition, to stay in the hotel, you have to have all these different outfits that you're wearing, and just the outfits alone can get really, really expensive.
Speaker 1:And so if you can find a way to be gifted or sponsored by a company that can go a long way towards helping your kid do this, this, whatever thing it is that they hopefully want to do, and it's not you projecting onto them You're able to make it work because the clothing is paid for, the hotel is paid for because you got a sponsorship from somebody and now you're able to go there. You wouldn't have been able to otherwise, and so I think that's why they just keep sitting there, because you can get so much more again, more than just like dollars, you get the crap you know so.
Speaker 2:So this, this is something that kind of pops into my head. It's like I mean, now you're, now you're placing views on the child's worth. That's going to cause a lot of difficulties as far as in the future. I'm curious to see what Honey Boo Boo is up to today. I'm curious if she's ever spoke out against or for whatever her upbringing.
Speaker 1:I know that she has sort of stayed in the media. I don't know where she is exactly right now. She stayed for quite a while, and she would pop up periodically. Part of that was because that family is a hot mess, and so sometimes it was mom or dad who was in the media for the wrong reasons, but I think that's because she grew up in it.
Speaker 1:There's a famous child influencer that I found as part of this. Her name is jackie dejo deyo. She's dutch originally and you know, kind of arguably, she was groomed by her parents like she was brought up in this environment, kind of like honey boo boo. But there are always cameras, you're always performing and so I think you lose that kind of like oh, this is, this is actually not okay that my entire life is just out there. I mean, she went, she went rated r this one. I don't know about honey boo boo I don't think she has, but jackie, since turning 18, has an only fans. Even before that she was posting risque photos and her parents allowed her to.
Speaker 1:They're like, oh, if she really wants to, it's okay. Like, yeah, we delete like dick pics off of her account. For her when she was younger, oh god, I was like the fact that we have to talk about that, the fact that we're talking about your 13 year old, because they started posting photos of her when she was about six photos and videos of her doing sports she likes to, um, I think she does snowboarding and like one or two other things that she was quite good at and so they were posting that for her. And then, when she was 13, she got her first swimwear brand deal and that's where things just like see a dicks man everywhere, and then like nude photos of her were published.
Speaker 1:Uh, and then she was like you know what, you can go ahead and publish it, I'll publish my own. And her parents were like, okay, and just let her. And so now you've got this like barely legal child running around posting, because she was brought up to think that that was okay and because she has made so much money off of this and her family has made so much money off of this and her family's made so much money off of this, no one can convince her that this is not okay, because she's like, yeah, but I make bank capitalism as finest yeah, in case you needed another reason to hate on it.
Speaker 1:Also, another fun statistic for if you're thinking that's like, oh, this is only for like really big accounts, that like smaller accounts are not going to have this issue, that is partially true. Smaller accounts tended to have far higher numbers of women followers, based on studies that were done again by the New York Times looking at it. Followers based on studies that were done again by the New York Times looking at it. Once you got to around 100,000 followers, often those accounts had over 75% of their followers, who were male Over 75%. And this is an account that is showing a young girl. There are very few accounts of boys in this, this situation. Obviously they do exist, but not anywhere near. You see the number of like young girls who are being posted on instagram by their parents, because at that age you're not posting yourself, it's your mom.
Speaker 2:She's the one who is posting these photos and videos yeah, they're that's you know, and and so okay, so the thing that that kind of comes into my mind is like rather than deleting these comments and shit like that, I don't know, uh, call the fucking police yeah, I mean you, you can report this type of thing and you you can and should.
Speaker 1:I know that a lot of people are, and there has been some discussion about the feeling of frustration: I feel like I'm reporting stuff and nothing is happening, I keep reporting this and nothing is happening. We'll get into a little bit of that when we talk about where we go from here. I do feel that part of the responsibility is just not posting this stuff; we actually do have control. The internet does not need all of these photos and videos. I've been hating on Instagram for all this, because Instagram is the main social media that I use, and so that's where I see it, but it's everywhere. It's Facebook, it's YouTube, it's TikTok, it's Snapchat.
Speaker 1:All of them are guilty of this. All of them have algorithms that push any content that has interaction. All of them have known problems with actually following up on, hey, this person is making threats and really disgusting comments on stuff, they need to be blocked. All of them have that issue of actually following through.
Speaker 1:Yeah, yeah yeah, yeah, and I know that 100,000 followers sounds like a lot, but if you want to be making money, you are going to be needing that. Not necessarily that level, of course it depends. But you're not going to be getting sponsorship deals with 100 followers. You're not going to be getting sponsorship deals with 100 followers. You're not gonna be getting sponsorship deals with 1000 followers. Like. You do actually need to have that base to build up. So having 100,000 for somebody like this isn't a strange or like oh, there's only you know 10 kids out there who have more than 100,000 followers. No, there are a lot. So I mean we we could continue to talk about, like, all the terrible things that are happening out. I I think that is enough enough child predators to to give us a good base to say where do we go from here?
Speaker 2:I would say, yeah, don't fucking do it. And then, on top of that, that's a hard thing, because if a 40-year-old guy is commenting, you know, hey, hot stuff, it's not against the law. Unless they actually post something along those lines that is actually sexual, that no way in hell can be taken any other way but sexual. Yeah, and that's fucking gross. But it's moments like this, I believe in public shaming.
Speaker 1:I would agree with you on that. I know you and I have had some differences of opinion on the call-out videos and that type of public shaming, but absolutely for this type of thing. I fully support tracking down these men. We don't dox them, don't publish that information publicly. But you know, no shame in figuring out who they are, making sure it's the right person, and then, I don't know, whoopsies.
Speaker 1:I sent a screenshot of this to your workplace. Were you aware that your employee talks like this? Were you aware that your employee is making these comments on videos and stuff like that? And to their wife: are you aware that your husband's running around on Instagram and TikTok saying this type of thing? I fully name and shame for this type of thing, because unfortunately we can't count on the companies.
Speaker 1:It'd be nice to say, hey, Instagram and TikTok, they're going to get serious. That would be ideal, if the companies would police themselves and say, hey, this is a problem, we know this is a problem, we've known for years that this is a really big problem, and so we're going to cut down on who can see stuff. You'd be able to say, I don't want men to see my content, to add that feature in, so if you're working with a child's account you could actually restrict that. But they won't, because as long as you are on their site watching and commenting on stuff, the more you do that, the more ads you see. And the more ads you see, the more money you make for them. It's capitalism.
Speaker 1:Again, back to bite you in the ass, and so they're not going to. We are starting to see laws get passed, though. Never thought I would have something good to say in where do we go from here, Bart? Now I'm going to tamp the enthusiasm down, because most of this is around payment for child influencers, which is a whole other deal: there are these kids acting and working, right, and there's nothing to protect them when they turn 18 from their parents being like, bye bitch, and kicking them out, and they don't get anything. They may not actually benefit from any of this stuff that is happening around them. It's just mommy who makes all the money. Mommy and/or daddy; some of these are the family vloggers. But we are at least starting to see laws get passed that say you have to compensate child influencers. Fifty percent of the earnings for a piece of content must be placed in a blocked trust fund; that is actually a law that Illinois passed.
Speaker 2:Wow Illinois.
Speaker 1:I know. Now, that depends on how much of the video the kid is in. So it may be less if the kid is only in a tiny little second of the video, but if they're in, and I forget what percentage of, the video, you are required to put this bit aside for them, again in a blocked trust fund. And it was actually a teenager who helped get this passed. A 16-year-old saw what was happening and was like, damn, I wouldn't want my most intimate private moments as a child to be published online, and then to have everyone know this and see it, and I not be able to benefit at all or be able to delete it. So she helped get this passed. Good for her.
Speaker 2:Go Gen Alpha.
Speaker 1:Right. I was like, oh, putting the rest of us to shame. California's in the process of passing something similar. They already have the Coogan Law for child actors, and they're rolling child influencers into that. France passed a law in 2020 regarding under-16s on the internet, not how they can act on the internet, but content about them and what happens to their earnings. Going back to the US, Washington's working on a similar bill. So we've kind of acknowledged that something should be done. Now, we're not talking about all of the data that's out there at this point; no one's really doing anything about that. Europe, of course, has the right to be forgotten. Do you know the GDPR?
Speaker 2:No.
Speaker 1:Okay, so it's a law passed in Europe five or six years ago that includes a whole bunch of regulations. One of them is that, for instance, if you're ever here in Europe and you're trying to get on the internet, every different site you go to, you're going to have to say yes or no to give them access to your data, and which kinds: essential versus non-essential cookies, that type of thing. That was part of it. One of the things that's also included in GDPR is the right to be forgotten.
Speaker 1:You have legal backing to go to Google, for instance, and be like, delete everything, delete this article about me. It's gone. It's gone from the internet. And this has been extended in France, and I think in Europe they're working to give kids the ability to do this semi-retroactively: when the kids become adults, they can be like, I want this gone. Now, unfortunately, it can still exist other places, because, you know, the internet is forever, and people may have saved copies or screenshots or something like that. So it may not be truly, truly gone, but it at least gives them a better ability to say, I was forced to do these videos or take these photos when I was really young, I hate it, it ruined my life, I don't like any of that, and now that I'm 18, delete it. Delete that account. I like that. I like that too.
Speaker 1:I'd love to see more of that, and I think it's going to have to be the government that steps in, because the companies aren't going to. Which is a sad day, when you're like, please, government, come save me. Oh God, yeah, we're fucked.
Speaker 2:I know like. So like I do know like California did pass something along the lines of um. You know, please don't sell my data yeah, gdpr. The California law is based off of GDPR, I believe and so, oh god, well, you know, and we're still at the childlike of the Internet.
Speaker 1:And look how bad it is now.
Speaker 2:Yeah, well, I think that's going to be true with anything. You know, I think of even cars. When they first came about, there were no speed limits, there was no...
Speaker 1:I I mean seatbelts. Forced seatbelts is a relatively new thing yeah, so.
Speaker 2:So yeah, you know like um, uh, even turn signals and brake lights and so, yeah, I mean know at one point cars were, you know, as time progressed, you know they got safer and safer, but you know how many people got killed, you know, prior to that.
Speaker 1:No, exactly. And unfortunately the internet is moving so much faster than regulation can, because regulation is just slow. It takes a long time: you realize the problem, you start talking about the problem, you make a big deal about the problem, you finally get it to the right people, the right people talk about it and make a big deal about it, and eventually, years later, it's finally like, oh hey, it's illegal to do that. It's not a speedy process. But the internet moves so, so quickly that I need the government to speed up and do something, because by the time they finally react to this, we'll have moved on to other, far worse things. And the sooner laws are set down the better, because, again, the companies know. I found this information just doing my own searches; it's not like I was doing some sneaky shit, digging around in their hidden internal files, and uncovered, what was it, 500,000 accounts a day? They published that information themselves almost five years ago, and we just haven't gotten there. Something has to be done to start saying: this is illegal, that's illegal.
Speaker 1:That's illegal, you know, and it may not be top government, but like within countries even, but like the, the norwegian couple, julie and camilla. You know that's the type of thing where I look at them like norway would do well to they. They know who, julie and camilla. You know that's the type of thing where I look at them like norway would do well to. But they they know who julie and camilla are. They're famous influences within the country. They've been known for years. I am truly shocked that somebody hasn't kind of been like yo, child protective services, like we may be a little bit worried about, like is this child being exploited, child being okay, that type of thing. I'm a little bit shocked that there hasn't been more of a grass movement, grassroots movement, towards like reporting these people or towards pushing for even just local laws that say you can't do this, you cannot post your child on this age, you know, if you were caught commenting illegal, not racist, but really super sexual stuff. I'm shocked that we haven't acknowledged that that's a no-no.
Speaker 2:Oh God.
Speaker 1:So, on our Scale O' Toxicity, where would you rate momfluencers? Because that's kind of the overarching theme for all three of these episodes; momfluencers are the main issue here. Would you say that they are a green potato, makes you kind of sick if you eat it, but just scrape off the green part and it's okay? Are they a death cap mushroom, 50-50 chance of death or coma, even when cooked? Or are they a delicious but deadly last snack of antifreeze?
Speaker 2:So I would put this at death cap plus Uh-huh. And the only reason why I don't say antifreeze is because, unfortunately, uh, influencers are how we learn to live nowadays. Um, yeah, sad but true. And you know, like being able to find the best way of having a play date and being able to, you know, introduce your kids to you know other kids and you know things like that. Yeah, that that's great and valuable and things like that. But I mean, when 75 of your viewers are fucking old guys yeah like that was a horrific figure.
Speaker 1:There were so many horrific figures that I ran across here.
Speaker 2:And even you know because as I was saying that like even giving benefits of the doubts to the men maybe half of them are single dads. Yeah, are single dads who are trying to figure out, like you know way to become, you know, a father slash mother to the child yeah, or maybe you know their daughter is also in whatever sport and so they're following along.
Speaker 1:Absolutely. You do, of course, have some of these men who are legitimately following. I was like you wouldn't look at me like two percent man. That's awful. But when you look at 75 percent of followers on a child's account, it's like even if half of you are legit which I don't think is true that's a lot of dudes yeah, you don't need to be following this.
Speaker 2:Yeah, yeah.
Speaker 1:So I would actually give this antifreeze because I think it has such a potential to ruin people's lives for way longer than we think. Again, we are just now kind of starting to see how it can ruin we know how it can ruin our lives as adults. People have learned that lesson and they're still learning it. But I think that as time goes on, in the next like 10-ish years, we're going to see that wave of kids who grew up on the internet, who are now becoming adults, and some of them may be like Jackie and they're like hell yeah, I'll take what I can get, but I think a lot of them are going to be like. Some of them may be like Jackie and they're like hell, yeah, I'll take what I can get, but I think a lot of them are gonna be like some of these other people that I saw who were like no, this is horrific, like all this information exists about me and I don't want it to, and people know intimate details about my life and my childhood that I can never erase and people will always know that, and so I'm always at risk because some crazy stalker followed me as a kid and now here I am, I'm 25, I'm 30, and they're still obsessed with me and they're still following me through other people or they know where I live. They found out about me and now, no matter what I do, they keep following. What I do, they keep following.
Speaker 1:I give the benefit of the doubt to the first momfluencers way back in the day, the mom bloggers. I don't hate on them for what they did, because we didn't know. I mean, think about the stuff we used to post on Facebook or MySpace. Nobody was talking about this type of thing back then. Yes, of course it existed. Of course child pornography was on the internet since day one. It's the internet. But this was not a conversation that was happening about like, hey, your photos could be taken. We didn't have AI that could animate a photo, a still photo of you and have you say or do really inappropriate things. That never actually happened. We have that technology now. That didn't exist. Back then we weren't thinking about that, and so I don't blame them for starting it and saying, yeah, I'm going to have this blog and, like, I'm going to post my kids on there and like you can follow along and see what we do.
Speaker 1:I don't blame the people who you know, want to post some photos of their kids on Facebook. I love seeing my friends' kids. I am always excited to see them, whatever it is they're doing, but there comes a certain limit where I'm like you are posting a lot of photos of your kid. How is your kid going to feel when they're older? Like, put yourself in their shoes. And that's even you know, and I hope you have your privacy settings in order for that.
Speaker 1:If you're out here posting a lot of photos of your kid, you absolutely should be checking. You can post them, though I'm not going to say like no one should ever post photos of their kids, because that's just no. But the people who are willingly choosing to create accounts for their children in this day and age, you know what is out there. You don't have the benefit of the doubt to say, oh well, I really want my seven-year-old to have these opportunities and to get these scholarships and it's not even scholarships like sponsorships for dance because it's her passion. I'm sure it is her passion, but you know full damn well that there are going to be child predators looking at photos of her and pedophiles looking at photos of her.
Speaker 1:Is that worth the money? And I think that when you say yes, it is. I think that makes you a really bad person. Yeah, yeah, hence antifreeze. I don't think you can be a momfluencer in 2024 or 2025 and be a good person. If you are posting photos and videos of your kids, if you are sharing details about them to the wider internet to make money. You're a bad person. You're a bad person.
Speaker 2:I think there is a proper way to do it, which sure You're not going to get as many views and things like that.
Speaker 2:but yeah you know, if you do want to, if you do legit want to help somebody out, and you found this new way of doing things. You don't have to show videos, you don't have to post pictures, you don't have to go over every single part of their lives. You can still say hey, I discovered, like you know, like a really good way to have my child gender neutral. My child you're not giving any information out to meet friends via online or whatever.
Speaker 1:Yeah, they like to go on this website and they can play these games and it's super educational. Yeah, I would agree that I think there are some very, very narrow ways that you can still be an influencer about kids and motherhood, and all of that without involving your kids. It just takes so much more work to do that. I think people aren't going to do it. They're going to have to put in a whole lot more work to get there and then it will always be a fight because the people who show their kids are going to get the views and the people who are careful are not going to get the views.
Speaker 1:Oh God, on that happy ending. If you also hate momfluencers, or if you have been traumatized by a parent who overshared, who got a little into sharenting and shared horrific things about you online, you can write to us at toxic, at awesome life skillscom. You can also write to us or follow us on social media. We've got all of them, except for Snapchat, cause don't got time for that. Follow us there, please. We like to see you and that's it until next week.
Speaker 2:This has been the toxic cooking show bye.