Recent comments in /f/Futurology

meshtron t1_j96ncc7 wrote

(Assume this was directed at our friend who's celebrating the 60% of his DNA that matches a banana, but it's something I enjoy thinking about as well)

You're right that the physical distances involved mean it's impossible to rule out alien consciousness elsewhere in the universe. But despite what u/just-a-dreamer- seems to understand, you ALSO have to consider how infinitesimally small human existence is in time compared to the universe.

VERY generously, we've had the ability to receive any wavelength of signal arriving at Earth from outside our solar system for maybe 200 years. Our planet has existed for less than 1/3 of the life of the universe. It's entirely possible that other civilizations evolved, flourished, then died out (including their planet and even their solar system) long before Earth ever existed, and likely that any signal they sent arrived before we were even able to know it had reached us.

We're still at the very beginning of trying to detect life beyond our planet, much less our solar system (and really not at all beyond our galaxy). The real question is whether humanity survives intact long enough to continue this search. I'd wager the chances of that happening are far less than the chances that intelligent life exists in the universe outside Earth right now.

2

just-a-dreamer- t1_j96hyta wrote

We will settle the galaxy for the same reason we ventured out of Africa: it is who we are. It is the same reason the Greeks built city-states at every location their ships could reach.

80,000 years, if it even takes that long with the technology to come, is not that much.

Automated machines can prepare new star systems for settlement, while humans might be born from stored DNA upon arrival. Or whatever we will look like at that point in time.

An alien civilization that follows the laws of evolution would have been detected by us already, at least in our galaxy. One million years is little in cosmic time, but gigantic in terms of what exponential growth will let us accomplish.
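The exponential-growth intuition can be made concrete with a toy calculation. Assuming the number of settled systems doubled every 1,000 years (a figure picked purely for illustration, not from the comment), saturating the Milky Way's roughly 100 billion stars would take only a few dozen doublings — though in reality travel time, not doubling rate, would be the bottleneck:

```python
import math

STARS_IN_GALAXY = 100e9      # rough star count for the Milky Way
DOUBLING_TIME_YEARS = 1_000  # purely illustrative doubling time

# Doublings needed to go from 1 settled system to the whole galaxy.
doublings = math.ceil(math.log2(STARS_IN_GALAXY))
years = doublings * DOUBLING_TIME_YEARS
print(doublings, "doublings ->", years, "years")  # 37 doublings -> 37,000 years
```

Under these made-up assumptions, full settlement takes well under the 1 million years discussed below, which is why the argument hinges on whether unchecked exponential expansion is plausible at all.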

1

just-a-dreamer- t1_j96g8t6 wrote

Capitalism dies in the age of abundance, for trading labor for goods and services would become pointless. That is a goal worthy of any risk, including extinction.

Besides, what are we left with without AI progress? The climate is slowly collapsing, and international conflict is coming back, this time with nuclear arsenals. We were 1.5 billion people in 1900; now, in 2023, we are at 8 billion.

Too many people competing for a smaller and smaller pool of available resources.

We either grow in technology or we die; that is our only path forward. If we could eradicate capitalism along the way, humanity could vote on a shared destiny: how we want the world to look for future generations to come.

1

theedgeofthefreud t1_j96g3bs wrote

Even if an alien civilization did inhabit all of its own solar system, we would currently be unable to detect them. We can't actually observe any of the worlds they would inhabit. Also, what makes you think that we would inhabit half the galaxy? A quick Google search informs me that a trip to Proxima Centauri would currently take 80,000 years. 80,000 years is a long time. Consider how humans lived 80,000 years ago, and try to envision how they will live 80,000 years in the future. It's pretty closed-minded to be so sure that there is nothing alive in a place as vast as this universe. Perhaps you've seen that picture describing the reach of our radio transmissions? https://www.planetary.org/space-images/extent-of-human-radio-broadcasts
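For what it's worth, the 80,000-year figure roughly checks out if you assume something like Voyager 1's cruising speed — the distance and speed below are real published figures, but the scenario is pure back-of-the-envelope arithmetic:

```python
LIGHT_YEAR_KM = 9.461e12     # kilometres in one light year
DISTANCE_LY = 4.25           # distance to Proxima Centauri in light years
VOYAGER_SPEED_KMS = 17.0     # Voyager 1's cruise speed, km/s

# Travel time = distance / speed, converted from seconds to years.
seconds = DISTANCE_LY * LIGHT_YEAR_KM / VOYAGER_SPEED_KMS
years = seconds / (3600 * 24 * 365.25)
print(f"{years:,.0f} years")  # on the order of 75,000 years
```

So "80,000 years" is the right order of magnitude for our fastest existing spacecraft, which is the point being made.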

1

SIGINT_SANTA t1_j96eukp wrote

The scale of AI's threat of annihilation is much higher than that of nuclear weapons. And the incentives to improve AI are much stronger than the incentives to make more dangerous nuclear weapons. And the challenge of preventing AI proliferation is much harder than the challenge of preventing nuclear proliferation.

I also think it’s unlikely that a nuclear war would actually cause human extinction. It would certainly kill a ton of people (perhaps almost all). But even in the worst case scenarios it seems very likely that a few million would survive in New Zealand or some other remote location.

And do you really hate capitalism so much that you would kill your family and friends to end it? Really?

1

just-a-dreamer- t1_j96eegh wrote

By the law of exponential growth.

Life on Earth dates back about 4 billion years. We made ourselves known to the galaxy just 65 years ago with the atomic bomb, and about 100 years ago with faint radio signals.

Within 1,000 years we will inhabit every planet in the Sol system, and our existence will be seen like a beacon. In 1 million years we could have settled half the galaxy.

Any alien civilization that is just 1 million years ahead of us in evolution would have been detected by us by now. One million out of 4 billion years is not much of a margin.
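As a sanity check on the "we would have detected them by now" claim, it helps to compare the volume our own transmissions have covered against the galaxy. The galaxy dimensions below are standard order-of-magnitude figures, and the calculation is only a sketch:

```python
import math

RADIO_AGE_YEARS = 100        # roughly how long Earth has been broadcasting
GALAXY_RADIUS_LY = 50_000    # Milky Way disc radius, order of magnitude
DISC_THICKNESS_LY = 1_000    # rough disc thickness

# Our signals fill a sphere whose radius in light years equals their age in years.
bubble_volume = (4 / 3) * math.pi * RADIO_AGE_YEARS**3
# Model the galactic disc as a flat cylinder.
galaxy_volume = math.pi * GALAXY_RADIUS_LY**2 * DISC_THICKNESS_LY

fraction = bubble_volume / galaxy_volume
print(f"Our radio bubble covers about {fraction:.1e} of the galactic disc")
```

By the same logic, a civilization whose emissions started reaching us less than a century ago, or that broadcasts no louder than we do, could easily sit undetected in the other 99.9999+% of the disc.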

0

theedgeofthefreud t1_j96cxni wrote

You do recognize that exoplanets are detected by the wobble of their stars (or by dips in the star's brightness), not by direct observation, and that the closest star is 4.25 light-years away? How in the universe could we say whether there are aliens or not? Especially considering that there are at least 100 billion stars in our galaxy and likely 125 billion other galaxies.

1

just-a-dreamer- t1_j968uvg wrote

Humans have had the tools to destroy the world since around 1960. AI adds nothing new to the threat of annihilation.

A self improving AI has the capabilities to bring about an age of abundance for humanity. Or wipe out all humans. Both will be the end of capitalism, which is good, one way or another.

In this interconnected world, even the rich can't contain AI forever; the technology will eventually spread across the globe, I think. Somebody somewhere will push AI development further out of some self-interest.

Even the rich are not that united; there are many layers of billionaires fighting each other. For example, the Saudi aristocracy is insanely rich as a group, but internally 1,000 male princes fight for power in their own game of thrones.

1

SIGINT_SANTA t1_j966hxh wrote

It seems very optimistic to assume AI will mean the end of capitalism or abundance for all.

The most likely outcome seems to be it destroys us in pursuit of some objective we gave it. If you don’t think that’s a possibility I suggest you read what some of the people working on AI alignment have written.

But if by some absolute miracle that doesn't happen, AI is going to be the greatest tool of power concentration we've ever created. Whoever controls powerful AI would basically run the world. The default outcome is that some large company or government will have their hands on the levers.

1