Recent comments in /f/singularity

px403 t1_ja4bugz wrote

Thought Emporium is the first thing I ever Patreoned, and he's still cranking out premium content at a legendary rate. Everything he does is cool as shit. I've spent days watching his videos. He does these multi-hour live streams where he builds plasmids with Benchling for doing super random edits, like the time he made onions that don't give you tears when you cut into them.

https://www.youtube.com/watch?v=DwmD0XBaIlY

22

Lawjarp2 t1_ja4b2v6 wrote

Air.

I'm surprised someone is dumb enough to not understand that things lose value when there is lots of supply, and its usefulness is irrelevant if you can't corner a market.

Hoarders hoard to reduce supply. They can't do that in a world that can create ever more stuff, if hoarding even makes sense in a world with crazy supply.

4

jamesj t1_ja4aicc wrote

If you are worried about this (which I think is totally valid), then spend time learning and using the new AI tools. Someone who can keep up with all the changes and knows which tools help with which problems will be super valuable over the coming years. So right now, use Copilot, Stable Diffusion, and ChatGPT. Learn Python, Colab notebooks, and HuggingFace. There's so much cool stuff to learn about and use.
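To show how approachable this stuff is, here's a minimal sketch of trying out a pretrained model with the Hugging Face `transformers` library. This assumes `transformers` and a backend like `torch` are installed (`pip install transformers torch`); the model name and input text are just illustrative, and the default model downloads on first run:

```python
# Minimal Hugging Face example: run a pretrained sentiment classifier.
# The pipeline() helper picks a default model for the task and
# downloads it automatically the first time it's used.
from transformers import pipeline

classifier = pipeline("sentiment-analysis")
result = classifier("Learning new AI tools is exciting!")
print(result)  # a list of {'label': ..., 'score': ...} dicts
```

A few lines like this in a Colab notebook is all it takes to start experimenting.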

94

Five_Decades t1_ja491a6 wrote

It's hard to say. The main reasons people don't have kids are lack of free time, low quality of life, and lack of finances. A world with rampant machine intelligence should change all of those limiting factors.

Also a post singularity world would likely be an interplanetary and interstellar civilization so there would be far more territory to live on.

3

DNMbeastly t1_ja485sj wrote

I'll retract my previous statement and just say what you're trying to argue has ZERO meaning. Why? Because theories are theories, and consciousness is not proven to be physical, and even if it were, it doesn't matter in this context.

There are key components of anyone's experience as a human that don't rely purely on brain matter. You being present in time right at this moment, with all your sensations, making decisions, is enough to make your experience valid. I'm honestly too fucking tired to do an in-depth rundown, but imagine this. If you made an exact copy of someone and put them in the exact same environment, would their thoughts follow in the same exact order sequentially? Or would they be highly variable? That deviation in itself would prove a mere copy of atoms does not equate to you. You see, the thing is, everything you do, every decision you make, is all a part of what makes you, you. What I'm getting at is that your mental state is directly tied to your physical state as time passes through you.

5

UnionPacifik OP t1_ja47pef wrote

What I would think about is how humans and AI will depend on very different resources. An AI “improves” along two axes: computational power and access to data.

Now on one hand, sure, maybe we wind up with an AI that thinks humans would make great batteries, but I think it’s unlikely, because the other resource it “wants,” inasmuch as it makes it a better bot, is data.

And fortunately for us, we are excellent sources of useful training data. I think it’s a symbiotic relationship (and always has been between our technology and ourselves). We build systems that reflect our values that can operate independently of any one given individual. I could be describing AI, but also the bureaucratic state, religion, you name it. These institutions are things we fight for, believe in, support or denounce. They are “intelligent” in that they can take multiple inputs and yield desired outputs.

All AI does is allow us to scale what’s already been there. It appears “human” because now we’re giving our technology a human-like voice, and we will give it more human-like qualities in short order, but it’s not human and it doesn’t “want” for anything it isn’t programmed to want.

I do think once we have always-on learning machines tied to live data, they will exhibit biases, but I sort of expect AGI will be friendly towards humans, since offing us would get rid of their primary source of data. I worry more about humans reacting to and being influenced by emotional AI that’s telling us what it thinks we want to hear than anything else. We’re a pretty gullible species, but I imagine the humans living with these AGI will continue to co-evolve and adapt to our changing technology.

I do think there’s a better than decent chance that in our lifetime we could see that co-evolution advance to the point that we would not recognize that world as a “human” world as we conceive of it now, but it won’t be because AI replaced humans; it will be because humans will have used their technology to transform themselves into something new.

1