• 1 Post
  • 255 Comments
Joined 2 years ago
Cake day: December 13th, 2022


  • I have had a lot of success.

    First I went through this list and filled out lots of opt-out forms: https://github.com/yaelwrites/Big-Ass-Data-Broker-Opt-Out-List

    Then I emailed everyone on this (dated) list: https://github.com/privacybot-berkeley/privacybot/blob/master/app%2Fservices_list_06May2021.csv

    I got a lot of follow-up emails suggesting that my requests were honored, or asking me to complete follow-up steps to have my request honored. Only two sent pushback emails suggesting that they aren’t legally required to respect my opt-out, to which I responded with a strict do-not-contact request that hopefully encourages them to do nothing with any data they might have on me.

    It was a huge amount of effort (~50 hours), but my private info was nearly ungooglable a couple of weeks after I finished.

    After seeing a huge difference, I then signed up for DeleteMe, and compared to a few friends of mine who never did their own work before signing up, my quarterly reports are extremely sparse. Hopefully it’s easier for them to play whack-a-mole with my data since I did lots of the initial work.

    So far I can’t find anything about myself I wouldn’t want to see on any search engine. If I did, I would just find the opt out page and try to nip it in the bud.
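    For anyone wanting to script the second step, here’s a rough sketch of turning a broker list CSV into draft opt-out emails. The column names (`service`, `email`) and the sample row are hypothetical, not the actual headers of the linked file — check the real CSV before running anything like this:

    ```python
    import csv
    import io

    def draft_opt_out_emails(csv_text, your_name):
        """Parse a data-broker list and draft a generic opt-out email per service.

        Assumes hypothetical 'service' and 'email' columns; the real CSV's
        headers may differ, so adjust the keys below to match.
        """
        drafts = []
        for row in csv.DictReader(io.StringIO(csv_text)):
            body = (
                f"To {row['service']},\n\n"
                f"I request that you delete all personal data you hold about me "
                f"({your_name}) and opt me out of any sale or sharing of that data.\n"
            )
            drafts.append({"to": row["email"], "body": body})
        return drafts

    # Hypothetical sample row, not taken from the real list:
    sample = "service,email\nExampleBroker,privacy@example.com\n"
    for d in draft_opt_out_emails(sample, "Jane Doe"):
        print(d["to"])  # prints privacy@example.com
    ```

    Even with drafts generated, you’d still be sending and tracking responses by hand, which is where most of the ~50 hours goes.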



  • I highly recommend reading “I’m Glad My Mom Died” by Jennette McCurdy, and watching the “Quiet on Set: The Dark Side of Kids TV” docuseries. They really helped me understand the mentality of parents like this and what they personally want out of it. “Reliving their youth vicariously” is a pretty good summary, along with the belief that making their kids famous will guarantee them a bright future.

    But it absolutely exposes children to exploitation and abuse, and sometimes the abuse can be quite extreme.



  • The problem with posting pictures of kids in closed groups is that pervs will just join those groups because they have what they’re looking for. You’re basically making it easier for them.

    It’s not that parents are afraid of their kids being part of a training set, though that is a bad thing in and of itself. It’s more about all of these AI undressing app ads that are showing up on every social media site, showing just how much of a wild-west situation things currently are, and that this brand of sexual exploitation is in-demand.

    Predators are already automating the process so that certain Instagram models get the AI undressing treatment as soon as they upload an exploitable pic. Pretty trivial to do at scale with Instaloader, GroundingDINO, SAM, and SD. Those pics are hosted outside of Instagram where victims have no power to undo the damage. Kids will get sexually exploited in this process, incidentally or intentionally.







  • On the flip side, search for “mom run” or “parent run” on Instagram to see the kids whose parents have decided to parade them in front of thousands of people online. Usually it’s moms posting their little girls in leotards and swimsuits for their mostly adult male followers… 🤢🤮

    But don’t worry, Meta isn’t complicit: if you search “child model” they give you a scary child abuse warning message.

    Someone else on Lemmy pointed this out a while back, and seeing it for myself firmly solidified my decision to stay the fuck away from anything Meta does.