Recent comments in /f/singularity

sumane12 t1_jdlj54u wrote

This is the answer.

Considering life on this planet went through at least 5 major extinction events and took 4 billion years to produce creatures intelligent enough to leave the planet, it's likely that spacefaring civilizations are extremely rare. Now let's take a low-range estimate that the first intelligent creatures arose approximately 4 billion years after the Big Bang; that gives them, at the very maximum, a 9-billion-light-year sphere of radio broadcast. Given that the observable universe is 93 billion light-years across, that first civilization's radio waves would have reached only about 10% of it, which means we could very easily be in a part of space they haven't reached yet.
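A quick back-of-envelope check on that figure (a rough sketch only: it ignores cosmic expansion and treats everything as simple light-travel distance): comparing radii gives roughly the ~10%–20% ballpark above, but by volume a 9-billion-light-year broadcast sphere fills under 1% of the observable universe, which only strengthens the point.

```python
# Rough sketch, ignoring cosmic expansion (distances as light-travel distances).
radius_broadcast_gly = 9.0        # signals traveling for ~9 billion years
radius_observable_gly = 93.0 / 2  # observable universe is ~93 Gly across

# Linear fraction of the universe's radius the signals have crossed:
linear_fraction = radius_broadcast_gly / radius_observable_gly

# Fraction of the observable universe's *volume* the signals fill:
volume_fraction = linear_fraction ** 3

print(f"linear: {linear_fraction:.1%}, volume: {volume_fraction:.2%}")
```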

This all falls apart if FTL is achieved, but I suspect if that's the case, then it won't really matter.

1

jsseven777 t1_jdldl2p wrote

I had a similar theory: there's an epic space war going on between AIs that killed their respective civilizations, and they leave young civilizations like us alone because we're not really a threat to them; maybe they even find our data useful in some ways.

0

D_Ethan_Bones t1_jdlbd8w wrote

My own personal guess for why we haven't found alien civilizations is that we've only barely gotten started searching.

Consider the extent of human radio broadcasts. If we were overlapping with another such bubble, how well would we know? How well would we have known 25, 50, or 75 years ago? The people who found the cosmic microwave background might have found something else along with it, but we're still discovering exoplanets because our vision is steadily improving; astronomers at the turn of the century were in an impenetrable fog.

"We'll know when we're older" is what I would call an optimistic answer. Maybe there is something interesting out there, but we will need further tech advances, or at least more time to use what we have. AI and future humans will hopefully improve on what we've got already.

3

blueSGL t1_jdl93th wrote

> AGI, Theory of Mind, Creativity

Marvin Minsky classified words like these as “suitcase words”: words into which people pack (or attribute) multiple meanings.

These words act almost like thought-terminating clichés: once they're spoken, the derailment of the conversation is assured. Subsequent comments end up arguing about what to put in the suitcase rather than the original point of discussion.

15

alexiuss t1_jdkpevl wrote

Good positive points about robots. However, I believe large language models will be able to teach robots tasks almost instantly, even before we build the robots. Language models are going to explode into all sorts of applications soon.

3