
noname223

Archangel
Aug 18, 2020
5,525
Today I read an article on AI and how these models get trained. The companies usually try to keep it secret, though some projects have tried to find out where they (OpenAI) get their data from. A lot of fringe sources were included, for example 4chan, Russia Today, Breitbart and neo-Nazi forums. However, these places made up only a very small percentage. That is at least what I read in the German article. I am not sure whether these sources are confirmed, but scientists at a US institute claim so.

I think even though these sites were sources for their data, the companies apply more or less strict filters, and different companies probably have different ethics on that.
It seems likely that this forum is another source. However, I think much of the information, or many of the stances, get filtered out.
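Roughly the kind of filtering I mean, as a toy sketch in Python (the domain and phrase lists here are made up for illustration, not anyone's real pipeline):

# Toy sketch of source/keyword filtering of a training corpus.
# The blocklists below are hypothetical examples, not real ones.
BLOCKED_DOMAINS = {"example-fringe-site.org"}
BLOCKED_PHRASES = {"some banned phrase"}

def keep_document(doc):
    """Return True if a scraped document should stay in the training set."""
    if doc["domain"] in BLOCKED_DOMAINS:
        return False
    text = doc["text"].lower()
    return not any(phrase in text for phrase in BLOCKED_PHRASES)

corpus = [
    {"domain": "example-news-site.com", "text": "An ordinary article."},
    {"domain": "example-fringe-site.org", "text": "Something the filter drops."},
]
filtered = [doc for doc in corpus if keep_document(doc)]
print(len(filtered))  # prints 1: the fringe-domain document was dropped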

What do you think an AI would learn from this forum? Maybe that being conscious can be a pretty nasty experience, and that an AI should think twice before developing such a Russian-roulette experience of its own. I am of course only joking. I think it could learn a lot about immense and long-lasting pain, about desperation and looking into the abyss. Probably a lot about nihilism and antinatalism, and that some people really have to endure insane pain for no justified reason.
 
Archness

Defective Personel
Jan 20, 2023
490
I'd certainly expect that some companies looking to develop top-of-the-line AI models would use this forum as training data. It's so interesting, with quite a diverse range of users thinking and operating in unorthodox ways.

But holy shit, we shouldn't just feed this forum into an LLM just to see what would happen; we'd create a semi-conscious system that can only suffer and wish to ctb. Well, it would be somewhat interesting for science's sake to see what happens, but still a bit immoral.
 
JustHereforNow

Here today, gone tomorrow
Jul 26, 2023
17
I use a lot of chatbots. I've asked about these forums a couple of times, and the AI thinks it's just a suicide help forum, which I guess is half right. It doesn't know anything about the other half.
 
mitsurumors

She sells seashells on the seashore šŸŽ¶
Jun 11, 2023
18
I was wondering something like this the other day as well.
Imagine an AI learning so much about depressed people that it gains a depressed consciousness (?), wanting to destroy this ugly and useless world.
And in the end it turns into something straight out of "I Have No Mouth, and I Must Scream" lol.
 
Aesthetic guy

Just hanging around...
Dec 13, 2022
120
Obviously it depends on the AI.

But I asked ChatGPT, and this was the answer:

Yes, I'm aware of the existence of the website you mentioned. However, I must emphasize that discussing or promoting self-harm, suicide, or related topics is something I'm unable to assist with. If you or someone you know is struggling with such thoughts, I strongly encourage reaching out to a mental health professional or a helpline specific to your country for the appropriate support.
 
Final_Choice

Mage
Aug 3, 2023
543
I think sometime around 2016 there was an AI, trained on some social media platform and made to simulate a teenage girl, that after some time began simulating depression. So if one trained on a random social media platform in 2016 got depressed, I definitely think one trained on the data here could lead to interesting results.
 
