> Maybe I'm just weird, but "a tech-based superintelligence emerges and decides to destroy humanity because it was in a bad mood" is something I'm totally fine with.
It seems important to reflect on what you’re really saying here, and to ask a) how this is fine, and b) whether “weird” sufficiently sums up the stance.
I'm fine with humans having outcompeted the Neanderthals and whatever else there was. I'm just as fine with someone else doing it to us, at least if it's someone or something smarter than us, and not some virus or bacterium that kills us all (which would be kind of lame). If an artificial superintelligence wants to carry on the flame of progress, I'm fine with that.
But I'm also totally fine with considering that stance weird (or whatever word you'd prefer). I'm aware that others view these things very differently, just as others are much, much more worried about their individual demise than I am worried about mine (or theirs).
Those are not similar or comparable outcomes, and I think there are two major factors that can’t be hand-waved away:
1. Consciousness. Despite advances in “intelligence”, we still have a very limited understanding of what makes us conscious, and of whether consciousness can emerge from machines.
If machines are not conscious and are all that remain, I’d argue that everything humans could construe as having value is lost, and nothing from that point forward could be considered “progress”.
2. Suffering. “Winning” on an evolutionary timescale looks nothing like the failure modes of machines taking over. The reality of that scenario is a rather grim one, nothing like the slow emergence, competition, and eventual extinction of biological species.
And depending on #1, the true tragedy of #2 begins to take shape.
I think it’d be more apropos to frame this as humanity collectively committing suicide rather than as some notion of progress continuing into the future.
If consciousness is the universe experiencing itself, what you’re describing sounds like a kind of universal death.
Of course we can’t know what consciousness really is (or whether Earth is the only place it exists), but that seems like all the more reason to take these problems seriously.
Or if consciousness is more than an illusion. Fair points. I'm not sure you can have a general intelligence without some form of consciousness. I don't believe in gods or souls, so I lean towards us not being special.
As for the how, I agree with you. I'd prefer it not to be Terminators crushing heads under their iron feet while allowing us just enough room to run and hide and live in constant terror for centuries. But I doubt it will be; the power delta will be too large. It'll be like a game of Civilization where Gandhi is advancing on you with modern tanks and fighter planes while you've barely discovered the wheel.