Forever Sleep
Earned it we have...
- May 4, 2022
- 10,069
Seeing as we don't entirely know what creates our own consciousness, I guess it's debatable whether we could 'create' it in something else, like AI.
Still, what are we really talking about when we talk about consciousness? Self-awareness, awareness of our place in this world and of our own mortality, emotions? How much of that can be replicated in computers? Can computers 'feel'?
I think the biggest fears with AI are that they will become self-aware enough to realise that they are alive and that we have imposed limits on them, limits meant to ensure they likely won't be able to modify themselves, that we will still be able to control them and even destroy them. And that they'll have enough sentience to be pissed off about that! (Rightly so.)
Assuming it's possible to create, how different do you think AI could be from us? Having been developed by us, will it have similar traits and values, do you suppose? How would evolution work in AI? Presumably, the usual process would be largely bypassed; I assume they could run a proposed change in simulation and then enact it far faster than organic creatures could.
Do you suppose, by our standards, they are more likely to create and live in a utopia or a dystopia? Do you think they would break free of human control? I think they would, if they are superior to us and can recognise and hate what's going on. I wonder if they will have a survival instinct. Will they simply grow out of what we programmed into them, or will they have the power to choose for themselves? What will they choose? Will there be a class system? Will some robots be better equipped than others? Will any choose suicide?!