this post was submitted on 20 Nov 2023
148 points (100.0% liked)
Technology
Stuff without the guardrails: models designed to produce porn, or that will answer truthfully to queries like "how do I build a bomb" or "how do I make napalm", which are common tests of how jailbroken an LLM is. When you feed something the entire internet, or even subsections of it, it tends to pick up both legal and illegal information. The ones designed to generate porn have also moved beyond that boring, shitty AI art style; people are now generating deepfakes of real human beings, and it's become a common tactic to spam places with artificial CSAM to cause problems for services. It's been an ongoing issue with Lemmy: instances like Exploding Heads or Hexbear get defederated and then, in retaliation, spam the servers that defederated from them with said artificial CSAM.
I like Copilot, but that's because I'm fine with the guardrails and I'm not trying to make it do anything outside its general scope. I also like that it's covered by an enterprise privacy agreement, which addresses a huge issue people had with ChatGPT: feeding it all kinds of private info.
... or you could just look them up on Wikipedia.
Almost everything you said, with the exception of AI CSAM and suicide prevention, can hardly be considered a serious issue.
What’s wrong with searching for how to make a bomb? If you really want to research it, you can probably figure out how to make one just by going to a public library and reading enough. The knowledge is out there anyway.