Recent comments in /f/singularity

drsimonz t1_ja9s2mx wrote

Absolutely. IMO almost all of the risk for "evil torturer ASI" comes from a scenario in which a human directs an ASI. Without a doubt, there are thousands, possibly millions, of people alive right now who would absolutely create hell, without hesitation, given the opportunity. You can tell because they... literally already do create hell on a smaller scale. Throwing acid on women's faces, burning people alive, raping children, orchestrating genocides—it's been part of human behavior for millennia. The only way we survive ASI is if these human desires are not allowed to influence the ASI.

2

AndromedaAnimated t1_ja9s24s wrote

An excellent description of a possible new world that, for a change, isn't dystopian. I applaud you 👍 This is a world to strive for. If only most people of today weren't as enamored with the "grind"…

I am pretty sure that once we have good education possibilities and no necessity for horrible, demeaning, useless "work" in hierarchies that are ruled by those high in dark triad behavior, people would finally be able to contribute to society with their real talents and abilities. And maybe even rekindle their own empathy and prosocial behavior.

Thank you.

8

dasnihil t1_ja9qyc5 wrote

to add to this bitchass complexity, the brain's activity is not just us thinking and talking—it's also regulating our lungs and heartbeat, with a plethora of noisy signals going on in there. i can imagine us toying with very specific regions of the brain to ignore most of the noise. it's going to be a fascinating decade. all our dreams come true, both the good ones and the frightening ones.

16

drsimonz t1_ja9q5av wrote

That's an interesting question too. Alignment researchers like to talk about "X-risks" and "S-risks" but I don't see as much discussion on less extreme outcomes. A "steward" ASI might decide that it likes humanity, but needs to take control for our own good, and honestly it might not be wrong. Human civilization is doing a very mediocre job of providing justice, a fair market, and sustainable use of the earth's resources. Corruption is rampant even at the highest levels of government. We are absolutely just children playing with matches here, so even a completely friendly superintelligence might end up concluding that it must take over, or that the population needs to be reduced. Though it seems unlikely considering how much the carrying capacity has already been increased by technological progress. 100 years ago the global carrying capacity was probably 1/10 of what it is now.

14

ThatUsernameWasTaken t1_ja9pvz7 wrote

"There was also the Argument of Increasing Decency, which basically held that cruelty was linked to stupidity and that the link between intelligence, imagination, empathy and good-behaviour-as-it-was-generally-understood – i.e. not being cruel to others – was as profound as these matters ever got."

~Iain M. Banks

4

Liberty2012 t1_ja9o3vy wrote

I think it is rather that the majority of individuals just want to pursue other work or interests, which they hope AI will enable in some manner, directly or indirectly.

Whether this will work out as some hope is certainly worthy of exploration, but I think the motives for most are not exactly as you have stated them.

2