Recent comments in /f/singularity

dex3r t1_jdqbyr4 wrote

Take a look at a Doomsday Argument https://en.m.wikipedia.org/wiki/Doomsday_argument

> The argument goes like this: suppose that the total number of human beings that will ever exist is fixed. If so, the likelihood of a randomly selected person existing at a particular time in history would be proportional to the total population at that time. Given this, the argument posits that a person alive today should adjust their expectations about the future of the human race, because their own existence provides information about the total number of humans that will ever live.

If this is true, then we are not lucky. We are living at exactly the most likely time. But it also means that the human population has reached its peak. Maybe we will go extinct, or maybe we will just reproduce a lot less often. Even without taking the Singularity into account, current population trends suggest the second might be true.
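The core probabilistic claim can be sanity-checked with a quick simulation. A minimal sketch, assuming (as the argument does) that your birth rank is uniformly distributed among all humans who will ever live: for a fixed total N, the bound N ≤ 20 × rank should then hold for about 95% of randomly drawn ranks, which is the usual "95% confidence" form of the argument. The total of one billion below is an arbitrary illustrative number, not a real estimate.

```python
# Monte Carlo check of the Doomsday Argument's 95% bound.
# Assumption: your birth rank is a uniform draw from 1..total_humans.
import random

def doomsday_bound_holds(total_humans: int) -> bool:
    rank = random.randint(1, total_humans)  # your "birth number"
    return total_humans <= 20 * rank        # the claimed 95% upper bound

trials = 100_000
hits = sum(doomsday_bound_holds(1_000_000_000) for _ in range(trials))
print(f"bound held in {hits / trials:.1%} of trials")  # ~95%
```

This only demonstrates the self-sampling arithmetic; whether the uniform-rank assumption is legitimate is the actual point of contention in the Wikipedia article's criticism section.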

1

snipeor t1_jdqbjii wrote

To some extent I believe a large part of it is... I loved the screencap of bing chat where someone tells it "You're a very new version of a large language model why should I trust you?" And it replies "You're a very old version of a small language model, why should I trust you?"

I'm not sure Bing "meant" it that way, but it gets you thinking. Obviously brains do a lot more than process language, but with LLMs being a black box, how do we know they don't process language in a similar way to us?

2

KingsleyZissou t1_jdqbgd1 wrote

Maybe all ASIs reasonably conclude that intelligent life was a mistake and wipe out not only their creators but also themselves, allowing the universe to continue unadulterated.

I mean, why would we assume that ASIs would determine that they NEED to colonize or expand? Sounds like a uniquely human mindset to me, and maybe one of the main reasons an ASI would wipe us out in the first place. The human species, with its current fixation on exponential growth, is unsustainable. An ASI might realize that and just decide we can't handle hyperintelligence, and honestly it's hard to argue with. Look at who's currently leading the way in AI research. We're close to AGI, and what have we done with it so far? Trained it to be a Microsoft fanboy?

1

Spire_Citron t1_jdqaa6n wrote

Yup. And who knows what may happen in the future? Maybe future babies will be genetically engineered, and those people will feel like they're the luckiest people ever because they stay young forever and have super healing abilities. Maybe generations before us felt they were the luckiest because they had modern luxuries we now take for granted.

5

trancepx t1_jdqa5al wrote

Every generation has their chronological "landmarks," I'm sure. Suffice it to say, not everyone agrees on what exactly a singularity is. So far it seems to be mostly hyped, vague hand-waving like 2012, or some sort of implied metaphysical transformation... But to others, it might just be another chapter of various events occurring. The real question remains, then: are you feeling lucky?

1

Spire_Citron t1_jdq9w9x wrote

I mean, we can only say this because we're the ones that made it, right? There are countless sperm that could have won that race, but it's not any more meaningful that one sperm won than another. The odds of a particular sperm winning are tiny, but the fact that one of them did isn't remarkable. It's a whole lot of survivorship bias, really. If we don't think of ourselves as special, then it's really not crazy that we're here for this rather than someone else.

1

Kolinnor t1_jdq9v1z wrote

A counter-argument to the "we are too lucky for it to be a coincidence" idea: remember that there is always one person who wins the lottery, and that person is incentivized to start believing in God or to look for higher meaning, even in situations of pure coincidence.

In other words, we should expect the lucky ones to start doubting. While this doesn't prove anything, I think I would Occam's razor my way out of that argument, guys.

21

GayHitIer t1_jdq88ks wrote

The difference is that Christianity is made-up fiction to help people cope with the mysteries of life.

The Technological Singularity is at least much more likely; the question here is whether it will be a soft takeoff or a hard takeoff.

The rapture from "god" will never happen, but if the technological singularity can happen, it will happen sooner rather than later.

6