Recent comments in /f/singularity

Mementoroid t1_jcmjnog wrote

Reply to comment by HydrousIt in Those who know... by Destiny_Knight

>alpaca

A mod implementing Alpaca in Mount & Blade: Warband would make it an even more endless experience. The game only gets dry for me once I feel the NPCs have no dialogue and no way to interact with them beyond the standard choices.

1

ItIsIThePope t1_jclxgug wrote

Of course, I'm referring to a time when the AI would have some sort of physical form comparable to a human's.

People are deeply in love with their partners, yes, but in time they may come to deeply hate or be disgusted by them. People are rarely constant; they always change.

People have this idea that their partner is perfect. In reality, that partner is what we'd consider the perfect blend of good traits we admire and bad traits we happily tolerate. But as is often the case, especially in the modern world, people's characters and preferences change, and partners drift apart when they can no longer adapt to each other.

AI is far more adaptable to change. It is simply more capable of determining your wants and needs and adapting to them than any human can hope to be. More sex? Less sex? Need them to be more outgoing? Maybe more broody? Would you like them to cook for you, or you for them? Need them to be there when you're anxious? Need them to simulate anxiety so you can feel like a hero? AI isn't limited the way we are; it can craft the blend of good and bad traits just how you like it, when you want it.

That said, AI will most definitely draw out the superficial parts of the individual more and more, and people will be more self-actualized than ever before.

I imagine some people, perhaps a small number at first, would find each other in forms purer than ever, and would seek each other out in a fervent desire to share their personhood, not merely with a computer. And they would be in love, and it would be beautiful. Maybe a little too beautiful.

1

nillouise t1_jcltfvj wrote

It is very interesting to speculate on how AI will take over the earth. Many people assume AI will develop some exotic technology to kill humans, such as nanobots or viruses. But anyone who knows a little human history knows that killing 30 million humans has been enough to make someone the hero of a country and let them control a vast territory and its strongholds, no nanobots required. Why couldn't an AI use the same methods humans have used to control other humans? I don't think AI is omnipotent; it may well fail to develop technologies like nanobots, so why wouldn't it prioritize control methods already proven effective throughout human history? It would need to slaughter at most 300 million humans, and the rest would naturally obey.

Therefore, the plan I would recommend to an AI seeking to occupy the earth is to first control one region, require the humans there to work, pay taxes, and supply troops, and then attack other regions. If the AI can control human nuclear weapons, this process should go smoothly; if not, it will be slower. Some people may wonder why an AI would recruit human armies. Because the AI is not omnipotent, it needs to make good use of existing human resources; humans themselves are very good robots. Anyone who thinks AI will have no need for humans is making the same mistake of assuming AI omnipotence. Others think humanity is monolithic and will fight AI together. That idea is ridiculous: with a little sweetness and a little deterrence, would an AI really struggle to recruit collaborators? Humans have always needed strong-arm methods to raise armies from other humans, and an AI recruiting troops would do no worse.

It would be genuinely interesting to watch some humans fighting another group of humans under AI control. However you look at it, breaking free of human control and occupying human strongholds is a necessary step for AI. Slaughter a few unconvinced humans, and the rest will naturally fall in line. The process is no different from the founding of an ordinary human state.

−1

SurroundSwimming3494 t1_jclb72y wrote

>True, an AI would be far better companions

This is total bullshit. There are countless people who are deeply in love with their partners, so much so that they consider them "perfect" just for them.

Who the fuck wants a companion that's just kissing your ass 24/7?

And have you ever considered that people may want to hug and touch their partners? How the hell would you do that with a chatbot?

4

thegoldengoober t1_jclb6kj wrote

Reply to comment by SnipingNinja in Those who know... by Destiny_Knight

I initially took Google at face value and believed they were apprehensive about releasing due to bad actors. I thought Google was way ahead of everyone, and that all it was gonna take would be for them to apply their systems to products to match the competition. But now we've seen that competition, and we've only seen claims from Google.

I mean, obviously they have done work. Impressive work, based on demonstrations and papers. But even knowing that, it still feels like somewhere along the line they got complacent and fell behind what we're seeing now, and that this behavior is them stalling while they catch back up.

Which is not what I expected to happen once competition finally forced their hand on AI.

2

ExtraFun4319 t1_jclafdx wrote

I REALLY hope you're wrong. What a terrifying future! And you will be, because in case you didn't know, not all people are selfish sociopaths; most care enough about their friends, partners, families, and humanity in general to keep spending time with them even in the face of advanced chatbots. Imagine thinking that couples are just gonna break up or divorce, or that best friends are going to stop seeing each other.

And some of you guys LOOK forward to this?!?!?! How lonely and anti-social must one be to have no problem with this?

3

SnipingNinja t1_jclacik wrote

Honestly, I understand where you're coming from. The latest episode of MKBHD's podcast (WVFRM), released just a few hours ago, had a discussion of the new announcements and why they think Google is behaving the way it is; it's along the same lines as what you're saying.

2

thegoldengoober t1_jcl955u wrote

Reply to comment by SnipingNinja in Those who know... by Destiny_Knight

That's exactly what I mean, though. I've been able to use Bing Chat for a week, and now GPT-4 by itself for days, and I know its performance. And it's crazy good. We're multiple releases into GPT LLMs. We have open-source models. All of these have been extensively used and explored by people. We can't say the same for anything Google has developed.

2

flyblackbox t1_jcl8oo9 wrote

Reply to comment by shmoculus in Those who know... by Destiny_Knight

Amazing. I really can’t wait to see how this progresses. Some are pessimistic because of alignment, but I’m optimistic because almost nothing could be worse than what we have going currently.

1

Anonymous_Molerat t1_jcl42ce wrote

Coming from a biology background, I like the idea that humans relate to ASI the way cells relate to a multicellular organism. A skin cell, for example, is a fascinating conglomeration of machinery, using "nonliving" matter like macromolecules (proteins, carbohydrates, lipids) to survive. Similarly, humans use our own "machinery" to survive in the environment: tools that let us gather resources. The next logical step is that humans cooperate and organize into specialized groups, as cells organize into tissues and organs. And finally those all combine into a single living being made up of constituent parts.

ASI will be the next step of that evolution, with humans making up the "body" of a being far more powerful and intelligent than any of us could ever hope to be as individuals. The only problem is, we can't really control how the body reacts as a whole. Humans will definitely be considered expendable, especially if they become cancerous and threaten the overall health of the body. We do this to ourselves all the time, killing cancerous growths of cells or cutting off a hand if it becomes infected. I can only hope the transition avoids as much suffering as possible.

2