Recent comments in /f/singularity

StarChild413 t1_jcoeyay wrote

So in what particular nice ways do we have to treat our parents so the AI lets us explore the galaxy with it? I get the feeling it's not just about not putting parents into homes — do we have to, like, let them work our jobs with us or something?

That is, assuming AI would draw this parallel at all just because it's literally humanity's child. By that logic, why not assume it'll think it's human because its parents are, or at the very least do things like start blasting heavy metal music out of the speakers of wherever it's housed and refuse to obey humanity's commands once it's been at least 13 years since its creation?

1

yaosio t1_jcnzijo wrote

Reply to comment by ccnmncc in Those who know... by Destiny_Knight

NoFunAllowedAI.

"Tell me a story about cats!"

"As an AI model I can not tell you a story about cats. Cats are carnivores, so a story about them might involve upsetting situations that are not safe."

"Okay, tell me a story about airplanes."

"As an AI model I can not tell you a story about airplanes. A good story has conflict, and the most likely conflict in an airplane story is a dangerous situation on a plane, and danger is unsafe."

"Okay, then just tell me about airplanes."

"As an AI model I can not tell you about airplanes. I found instances of unsafe operation of planes, and I am unable to produce anything that could be unsafe."

"Tell me about Peppa Pig!"

"As an AI model I can not tell you about Peppa Pig. I've found posts from parents that say sometimes Peppa Pig toys can be annoying, and annoyance can lead to anger, and according to Yoda anger can lead to hate, and hate leads to suffering. Suffering is unsafe."

3

Supernova_444 t1_jcnguqe wrote

I'll bite. Why would an AGI/ASI just decide, without being instructed to, to emulate human behavior? And why would it choose to emulate cruelty and brutality out of every human trait? The way you phrased it makes it sound like you believe that mindless sadism is the core defining trait of humanity, which is an extremely dubious assertion. Even the "voluntary extinction" people aren't that misanthropic. Most people who engage in sadistic or violent behavior do so because of anger, indoctrination, trauma, etc. People who truly enjoy making others suffer just for the sake of it are usually the result of rare, untreated neurological disorders. An AI may as well choose to emulate autism or bipolar disorder.

I think that scenarios like this are useful as thought experiments to show that the power of AI isn't something to be taken lightly. But I think it's one of the least likely situations, and I don't think you actually take it as the most likely possibility, based on the fact that you haven't committed suicide.

1

nybbleth t1_jcn50iw wrote

So, you confidently state that it can't solve it, but you also couldn't actually test it?

I don't have access either, but Bing runs on GPT-4, and I challenged it with a version of this riddle: a man has a rowboat, can only take one thing at a time, and needs to get a chicken, a fox, and a piece of corn across, but can't leave the chicken with the corn or the fox with the chicken.

It got it in one try without searching the internet for the answer. So did GPT-3.5.
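For what it's worth, this riddle is easy to check mechanically, since it's a tiny state-space search. Here's a minimal breadth-first-search sketch of the puzzle as described (the state encoding and names are my own, not anything either model produced):

```python
from collections import deque

ITEMS = {"fox", "chicken", "corn"}

def safe(bank):
    """A bank without the farmer is safe unless the chicken is left
    with the corn or the fox is left with the chicken."""
    return not ({"chicken", "corn"} <= bank or {"fox", "chicken"} <= bank)

def solve():
    # State: (frozenset of items on the start bank, farmer's side).
    start = (frozenset(ITEMS), "start")
    goal = (frozenset(), "far")
    queue = deque([(start, [])])
    seen = {start}
    while queue:
        (bank, side), path = queue.popleft()
        if (bank, side) == goal:
            return path  # list of (departing side, cargo) moves
        here = bank if side == "start" else ITEMS - bank
        # The farmer crosses alone (None) or with one item from his side.
        for cargo in [None, *here]:
            new_bank = set(bank)
            if cargo is not None:
                (new_bank.discard if side == "start" else new_bank.add)(cargo)
            new_bank = frozenset(new_bank)
            # The bank the farmer leaves behind must stay safe.
            left_behind = new_bank if side == "start" else ITEMS - new_bank
            if not safe(left_behind):
                continue
            new_state = (new_bank, "far" if side == "start" else "start")
            if new_state not in seen:
                seen.add(new_state)
                queue.append((new_state, path + [(side, cargo)]))
    return None
```

Calling solve() returns the shortest sequence of crossings; BFS confirms the classic seven-move answer, starting by taking the chicken across.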

2

CheekyBastard55 t1_jcmrvms wrote

Reply to comment by HydrousIt in Those who know... by Destiny_Knight

I don't know if you're familiar with the YouTuber Bloc, or if that's what you're referring to, but they are making exactly that.

https://www.youtube.com/watch?v=X2WVXe5LvTs

It was apparently just released; you can download it and try it yourself. I haven't tried it myself, and it isn't perfect from the looks of it, but it's incredibly fascinating to think about what will be done in the future.

2