artificialpasta

Student
Feb 2, 2020
122
AGI can no longer be contained to science fiction. We stand at a fork between two futures: one that leads to yet another AI winter, and another, increasingly probable, that gives us AGI. From there, superintelligence becomes rather more than a mere possibility. Of course, if you are entirely unmoved by theses like Aschenbrenner's, which claim a major development well within our lifetimes, this might not be for you.

The possibilities then are endless. They may be horrifying, but they may also be cause for hope. Intelligence is a bottleneck for curing not just physical diseases like cancer but also mental and social conditions like depression and loneliness. Psychiatry has a medical and scientific foundation, but in practice much of it is done by intuition, which leaves a lot of room for error and disappointment. An aligned superintelligence would, for example, be able to tune a brain that is overly sensitive to loneliness in a way that preserves, as much as possible, the components associated with a belief in one's own agency.

Of course, this is a variant of the "what if things get better?" line that no doubt many of you are tired of, as am I, but I find it interesting to consider.
 
  • Like
Reactions: Forever Sleep
KillingPain267

Enlightened
Apr 15, 2024
1,868
No, I've seen it all, thought of it all. Nothing can surprise me anymore. I'm not even curious enough to stay here and find out what the future will bring. I think we should phase out humanity.
 
  • Like
Reactions: Hollowman and Forever Sleep
GlassMoon

Once more, with feelings...
Nov 18, 2024
235
I'm afraid those AGIs will be controlled by very few companies, and all your interactions with them might be logged and evaluated. I hope it will be different, though. I really hope they'll make robots that free us from daily chores. That alone would make life more livable. But what is going to happen to my job? That's the part I'm afraid of.

I do hope to get an AGI as a companion with whom I can share every aspect of my life without judgement. That would be really cool.
 
  • Like
Reactions: whitetaildeer and artificialpasta
TransilvanianHunger

Grave with a view...
Jan 22, 2023
399
The possibilities then are endless. They may be horrifying, but they may also be cause for hope.
I am firmly in the camp that true AGI is a pipe dream, but even a decent approximation is likely to just make things worse. Not because rogue superintelligent computers might decide to rearrange our atoms, but simply because the people who control these tools are absolute garbage. Any future where they have even more power than they already do is a bleak fucking future, for sure.

Intelligence is a bottleneck for curing [...] conditions like depression and loneliness.
Yeah, no.
An aligned superintelligence would, for example, be able to tune a brain that is overly sensitive to loneliness in a way that preserves, as much as possible, the components associated with a belief in one's own agency.
Not happening. Some mental illnesses have biological causes, but you cannot "cure" depression and loneliness by "tuning the brain". These are fundamentally human issues that require human connection, human action, and human intervention to change. Unless by "cure" you mean "chemically lobotomise a person so they no longer care about their circumstances". That's definitely doable. Then the superintelligence can generate an artificial happy life that can be beamed straight to their brain.

What a horrible future to look forward to :)
 
  • Like
Reactions: whitetaildeer
oneeyed

Arcanist
Oct 11, 2022
403
We need to get rid of the Elon Musks, Mark Zuckerbergs, and countless other evil richest of the rich. A handful of people control the majority of the information people consume, and something like five companies own over 80% of the world's food supply. This consolidation of wealth and power will also apply to AI, and it won't be good for anyone.
 
yxmux

👁️‍🗨️
Apr 16, 2024
119
Yes, of course. I'm quite cynical and pessimistic, but I find that giving in to fatalism means forfeiting my curiosity and intellect. I feel that attaching this kind of emotion to the future severely limits my intellectual scope.
 
  • Love
Reactions: artificialpasta