> When my best bet for humanity is a global nuclear war resetting civilization to give us a chance to deal with AGI issue down the line - I'm fine with religious doomism.
I'd settle for corporate death sentence for any companies developing it with life imprisonment for any employees/corporate officers involved at any level whatsoever (or VCs funding it)
including selling chips/compute to companies involved
any potential benefits from AI are not worth even a 10% risk of it wiping out humanity
this position is going to become more common as the media (and electorates) grasp what these companies are attempting to do
All these companies really want is the profit generated by mindless slaves, so they don't have to rely on human labor for production.
It's a fool's journey: given existing societal mechanics, it becomes a self-fulfilling prophecy of destruction, and they are psychopathic enough to believe it won't turn out that way.
What happens when a large number of people can't get food, and they know why?
What happens when you have a large number of locusts eating all the food being produced?
How would an AI differentiate between human thought and the pests it was designed to eradicate?
Thinking machines should be outlawed, and anyone involved in their research purged.
They threaten all of humanity, its children, and its future. And the thing about percentages is that people often don't understand how they actually work with respect to probability and likelihood.
Given sufficient time, as long as an outcome has nonzero probability, it will eventually happen.
A 1% chance of an outcome, re-rolled independently every moment over unbounded time, will eventually land on that outcome: the probability of avoiding it across n trials is 0.99^n, which goes to zero as n grows. Once that outcome occurs, everyone's dead. It may not happen quickly, because time is not a constraint.
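The compounding argument above can be sketched numerically (a minimal illustration, assuming an independent 1%-per-trial risk; the function name is mine, not from any source):

```python
def cumulative_risk(p: float, n: int) -> float:
    """Probability that an event with per-trial probability p
    occurs at least once over n independent trials: 1 - (1 - p)^n."""
    return 1.0 - (1.0 - p) ** n

# With p = 1% per trial, the cumulative chance climbs toward 1:
for n in (1, 10, 100, 1000):
    print(n, cumulative_risk(0.01, n))
```

Over 100 trials the cumulative chance already exceeds 60%, and over 1000 trials it is effectively certain, which is the sense in which a "small" per-step risk is not small at all when the process repeats indefinitely.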