Recent comments in /f/singularity

Dreikesehoch t1_jaajuo8 wrote

We already know that brains are intelligent. We have no idea whether object recognition is a more efficient path, or whether it will lead to anything intelligent at all. Better to just build a scaled-up version of the human brain and then let this AI figure out the next steps.

1

Ok_Sea_6214 OP t1_jaaif74 wrote

https://m.youtube.com/watch?v=OMDlfNWM1fA

https://m.youtube.com/watch?v=94o-9zR2bew

In the second video you'll notice there's been an edit, where he goes from describing the useless class to the solution of taxing AI and using the money to help people. My guess is they cut out the part where he discusses the fact that there will be no point in retraining anyone if AI does everything.

If you are a horse in the year 1900 you're very useful, but by 1920 most of your job will have been replaced by cars and tractors. If at that point the horse really can't find a new job, then the cost of keeping it alive (food, healthcare, living space) will be weighed against the market value of horse meat.

Luckily for us there is no market for human meat, but weigh 8 billion people (and growing) worth of carbon pollution, food, living space, healthcare, entertainment, voting rights, property rights, and risk of revolution against the value of them not being there... Until very recently in human history, the solution has always been to "fire" them.

1

Verzingetorix t1_jaaepwc wrote

I disagree. Superhuman intelligence could end up making great discoveries, but those would not deploy overnight.

Manufacturing would require repurposing existing plants or building new ones. Drugs and therapies would require human testing and regulatory approval. Advances in infrastructure and in ground, air, and sea transportation would also take time to deploy.

An intelligence explosion will not necessarily result in advances that humans are able to implement, and even if it did, those advances would not magically transform day-to-day life overnight.

1