• 0 Posts
  • 274 Comments
Joined 1 year ago
Cake day: August 22nd, 2023


  • Man, they could have made the letter something that would persuade people about the importance of ideas and how no nation is a monolith, but they just couldn’t help but make it a blatantly “Israel is right” letter.

    “We continue to be shocked and disappointed to see members of the literary community harass and ostracise their colleagues because they don’t share a one-sided narrative in response to the greatest massacre of Jews since the Holocaust.

    “Israel is fighting existential wars against Hamas and Hezbollah…”

    Someone here is obfuscating reality, and it’s not the boycotters. These people are insane.




  • Absolutely agree. My comment above was focused on whether a minimal amount of CSEM in the training data would, by itself, surface similar images from an ordinary porn prompt, but there are a few mechanics that likely bias a model toward creating young-looking faces in porn, and with intentional prompt crafting I have no doubt you can at least get an approximation of it.

    I’m glad to hear about the models that intentionally separate adult content from children. That’s a good idea. There’s not much reason an adult-focused model needs to be mixed with other data; there’s already so much porn out there. Maybe you’d mix some in to tune something unrelated to the naked parts (like the background), or to get some mundane activity, but naked; neither of those needs kids in it.
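
    Purely as a sketch of what that separation might look like at the dataset level (the tag names, record format, and blocklist here are hypothetical, not any real model’s pipeline):

    ```python
    # Toy dataset-curation filter: keep only records whose tags are fully
    # disjoint from a blocklist before training an adult-focused model.
    BLOCKLIST = {"child", "kid", "minor", "school", "teen"}

    def is_clean(record: dict) -> bool:
        """True if none of the record's tags appear on the blocklist."""
        return BLOCKLIST.isdisjoint(tag.lower() for tag in record["tags"])

    dataset = [
        {"image": "a.jpg", "tags": ["portrait", "adult", "studio"]},
        {"image": "b.jpg", "tags": ["teen", "portrait"]},
    ]

    curated = [r for r in dataset if is_clean(r)]
    print([r["image"] for r in curated])  # -> ['a.jpg']
    ```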


  • I have not personally explored AI porn, but as someone with experience in machine learning and its accidental biases, that’s not very surprising to me.

    On top of the general societal bias toward youth in “beauty”-related roles, smoother and less-featured faces (which in general look younger) are closer to an average face, so defaulting to that gets a bit of a training boost (when in doubt, target the mean). It’s probably also not helped by youth-related porn keywords (teen, daughter, young) that further associate other porn prompts (even ones not about youth) with non-porn images of minors that also carry those keywords.
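
    To make that keyword-bridging mechanic concrete, here’s a toy co-occurrence count over invented caption tags; in a real captioned dataset the same effect shows up in the learned text embeddings rather than in raw counts:

    ```python
    from collections import Counter
    from itertools import combinations

    # Invented tag sets: "young" appears in both adult-content captions and
    # ordinary non-porn captions, so training links the two contexts through it.
    captions = [
        {"adult", "young", "studio"},
        {"adult", "young", "outdoor"},
        {"portrait", "young", "student"},
        {"portrait", "young", "school"},
    ]

    pair_counts = Counter()
    for tags in captions:
        pair_counts.update(combinations(sorted(tags), 2))

    # Every tag that co-occurs with "young", drawn from both kinds of images:
    bridged = {a if b == "young" else b
               for (a, b), n in pair_counts.items() if "young" in (a, b)}
    print(bridged)  # tags from both domains, all tied to one shared keyword
    ```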


  • I assume any CSEM ingested into these models is absolutely swamped by the massive amount of adult porn that’s far more easily available. A handful of images isn’t going to drive model output at the scale of the image-generation datasets. Maybe there are keywords that drill down to something more strongly associated with child porn, but a lot of “young”-type keywords are already plentifully applied to adults, and I imagine accidentally ingested child porn is much less likely to be conveniently labeled.

    So maybe you can figure out how to get it to produce child porn, but it probably won’t just randomly produce it for an innocent porn prompt.
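
    Some back-of-the-envelope numbers on the swamping (dataset size is web-crawl scale like the oft-cited LAION-5B; the contamination count is a made-up assumption, purely for scale):

    ```python
    # Rough dilution estimate: what fraction of training samples a handful of
    # contaminated images represents. The 3,000 figure is invented, not measured.
    dataset_size = 5_000_000_000  # LAION-5B-scale image-text pairs
    contaminated = 3_000          # hypothetical

    print(f"{contaminated / dataset_size:.2e}")  # 6.00e-07 -> under 1 in a million
    ```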



  • Edit: I misread the posted image; OP is suggesting rules to filter new accounts, not posting from a new account themselves.

    A brand-new account getting banned tells us almost nothing about whether the ban was warranted. Brand-new accounts talking about automod are usually either evading a ban or hiding history on a main account they don’t want the mods to see. These might be horrible power-abusing mods, but even if there’s no hidden history here, banning a brand-new account just because the vibes seem off is a-ok in my book.
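
    For what it’s worth, the kind of rule OP seems to be proposing is a few lines in any automod-style tool; the function name and threshold below are hypothetical:

    ```python
    from datetime import datetime, timedelta, timezone

    MIN_ACCOUNT_AGE = timedelta(days=7)  # threshold is an arbitrary example

    def should_hold_for_review(account_created_at: datetime) -> bool:
        """Hold posts from accounts younger than the threshold for mod review."""
        return datetime.now(timezone.utc) - account_created_at < MIN_ACCOUNT_AGE

    # An account created yesterday gets held back for a moderator to look at:
    print(should_hold_for_review(datetime.now(timezone.utc) - timedelta(days=1)))  # True
    ```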



  • There’s no reason to take this guy or his organization at face value when they make claims. It’s been hype and hopium for a decade now, fueled by TED Talks and wunderkind-loving media.

    Cleaning up the garbage patch isn’t just a matter of collecting the big pieces of plastic floating conveniently on the surface. Doing that is good, but it can never get the patch to “clean”; it just helps slow the accumulation over time. You get the big stuff (relatively) easily, then it gets progressively harder, and eventually impossible.

    Which is progress. It’s just not the lofty result they keep promising. If all it took was a big net and a relatively modest (by government standards) budget, this wouldn’t be a problem.
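
    A toy way to see the diminishing-returns point: if each pass removes a fixed fraction of whatever is still afloat while new plastic keeps flowing in, the patch settles at a floor it never drops below (all numbers invented; only the qualitative shape matters):

    ```python
    # Toy model: removal proportional to remaining debris, constant inflow.
    debris = 100_000.0   # tonnes currently afloat
    inflow = 1_000.0     # tonnes of new plastic per year
    removal_rate = 0.10  # fraction of remaining debris collected per year

    for year in range(1, 51):
        debris += inflow - removal_rate * debris
        if year % 10 == 0:
            print(f"year {year:2d}: {debris:,.0f} tonnes")

    # Settles toward inflow / removal_rate = 10,000 tonnes: collection slows
    # the accumulation but never reaches zero while plastic keeps flowing in.
    ```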




  • You’re just not addressing automation at all. We have nowhere close to a billion people specializing in tasks that can’t, either now or in the near future, be automated entirely or made so efficient that the required workforce would be drastically reduced. You don’t need 4 billion people to maintain (and improve) our standard of living, and we’re rapidly approaching the point where many jobs are better automated than done by people.

    If you want people to be free to innovate or make art or explore, the best way to do that is to not have them working pointless jobs for half their waking hours.