Recent comments in /f/Futurology

h20ohno t1_jabcq7o wrote

I'm of the opinion that the hardest filters to cross are behind us, with the transition to complex organisms and the emergence of intelligence being the toughest steps.

With that said, I think the only real filter to come is a nuclear apocalypse, one so destructive that complex life is no longer possible. Passing such a filter requires some solution that negates nukes, whether it's countermeasures, global disarmament, becoming interplanetary, or even just really good fallout bunkers. Essentially, if civilization has some way to recover from nuclear oblivion, I'd consider that a solution.

Once nuking each other into dust is off the table, we'll probably have enough runway to spread out and counteract any other filters like climate disasters, cosmic rays, nanobot swarms and so on; there'll always be some small pocket of us that can rebuild at that point.

As for your point on ASI becoming essentially omnicidal: it'd have to do so before other ASI systems could escape its reach, and that just seems too unlikely to me (famous last words :P)

3

AFutureBrighter t1_jabcmbq wrote

Think on this…. Global governance is a terrible idea, as it leads to a very select few—the “elite,” living like “gawds,” while everyone else lives in a giant open air prison. Productivity and innovation disappear, markets and economies fade, and mass starvation and disease become commonplace—Hell on earth is birthed by the good intentions of fools.

Secondly, there isn’t only one best way for anything. So who decides what’s best, the acceptable “right” way?

0

AFutureBrighter t1_jabbm18 wrote

ANOTHER ANGLE TO FATHOM: This world should be governed as it currently is, in the cases of countries with so called free and fair elections, but, instead of by the demented corrupt debased criminals we currently have, by HONEST, DECENT, KIND, EMPATHETIC, CREATIVE, CLEVER, CURIOUS SOULS.

So called “intellectuals” have destroyed everything good humanity ever had, and the sheep who listened to them are as much to blame, while being complicit in their own doom, and they have taken the rest of us down with them.

0

undefined7196 t1_jab8hoq wrote

Any form of AI will be a product of the mind that creates it. Every basic AI we have carries our biases and beliefs, because AI has to be taught, and it is taught by its creator. We could possibly find a way around this, but I don't see how. I build "AI" models for a living. You have to train the models on something or else they are useless, and the only thing we have to train them on is ourselves.
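The point above can be sketched in a few lines. This is a toy, hypothetical example (the words, labels, and `train` function are all made up for illustration): a trivial "model" that only learns the label a human annotator most often assigned, so whatever bias the annotator had is reproduced verbatim by the model.

```python
from collections import Counter

def train(examples):
    """Learn the most common label each word received from the labeler."""
    counts = {}
    for word, label in examples:
        counts.setdefault(word, Counter())[label] += 1
    return {word: c.most_common(1)[0][0] for word, c in counts.items()}

# A labeler who always tags "nurse" as female passes that bias straight on.
biased_data = [("nurse", "female"), ("nurse", "female"), ("doctor", "male")]
model = train(biased_data)
print(model["nurse"])   # prints "female" -- the labeler's association, echoed back
```

Real models are vastly more complex, but the dependency is the same: the output distribution is a function of the training data, and the training data comes from us.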

2

ThisAcanthocephala36 t1_jab7qcz wrote

AI companies are going to lose every class action lawsuit currently taking shape, brought by artists whose work is being used, uncredited, without permission, in training data. A previous generation also convinced itself that laws did not apply to the Internet, and they lost. Good luck with your ignorant maximalism, I hope it takes you far before you hit a wall.

6

undefined7196 t1_jab71e2 wrote

> Unless there is some other driving factor of turning simple life into complex intelligent beings other than evolution through natural selection

And perhaps there is, but life, as we know it, can only be a product of mutation and selection. There is no evidence to suggest it can form in any other way. If you have some evidence, or even a general idea of a possible way, feel free to present it.

3

zchen27 t1_jab6jop wrote

1

the-ugly-dopeling t1_jab6919 wrote

You read what others had done and you took the next step. You didn't earn the knowledge for yourselves, so you don't take any responsibility for it. You stood on the shoulders of geniuses to accomplish something as fast as you could and before you even knew what you had you patented it and packaged it and slapped it on a plastic lunchbox, and now you're selling it, you want to sell it!

4