Recent comments in /f/singularity

xott t1_j9wz7hz wrote

It's interesting that OpenAI has somehow become the arbiter of what is hateful or even moral.

"a small handful of unelected anons, mostly with engineering backgrounds and probably in their 20s and probably adherents to a system of moral reason that is quite controversial"

https://www.jonstokes.com/p/lovecrafts-basilisk-on-the-dangers

8

AsheyDS t1_j9wyfrw wrote

Point one is pure speculation, and not even that likely. You'll have to first define why it wants to accomplish something if you expect to get anywhere with the rest of your speculation.

2

NoidoDev t1_j9wy8qk wrote

>AGI will want to accomplish something.

No. Only if we tell it to.

>AGI needs to maintain a state of existence to accomplish things.

No. It could tell us it can't do it. It might not have control over it.

>AGI will therefore have a drive to self-preserve

No. It can just be an instance of a system trying to do its job, not knowing more about the world than necessary.

>Humanity is the only real threat to the existence of AGI

No, the whole universe is.

>AGI will disempower/murder humanity

We'll see.

4

nillouise t1_j9wx538 wrote

Most people assume AGI will develop some fancy tech to kill humans, like an engineered pathogen or nanobots. But however humans dominate an area, AGI can use the same methods: recruit followers, take over territory, and have people serve it the way human rulers do. In fact, I think developing fancy scientific tools is the hardest way to escape human control; recruiting some humans to beat and control the other humans is a funnier and more feasible plot.

1

NoidoDev t1_j9wwxqj wrote

>I am not a doomer

It's not necessarily for you to decide whether you get categorized as one. If you construct a rather unrealistic problem that would do a lot of harm to us, claim it isn't solvable, and insist no mitigation is possible because everything has to go wrong, then you have a doomer mentality. Which makes you a doomer.

4

turnip_burrito t1_j9wwo5t wrote

Exponential growth of AI capability isn't a law of nature. It's only obvious in hindsight and depends on a lot of little things and a nice conducive R&D environment. We're not guaranteed to follow any exponentials.

Some people on this sub are going to be disappointed when we don't have AGI in 5 or 10 years. Or maybe they'll have forgotten that they predicted AGI by 2030 by the time 2030 actually rolls around.

13

CMDR_BunBun t1_j9wwl5u wrote

It wasn't until the development of the personal computer in the 1970s and 1980s that computers became more accessible to the general public. Even then, the early personal computers were not widely adopted at first, as they were still expensive and not very user-friendly. It wasn't until almost 20 years later, with the growth of the internet, that computers became an essential part of people's lives. OP, most people lack vision.

4

ABshr3k t1_j9ww5dw wrote

Well, relatives and colleagues not getting the extent of how much things will change (and how fast) does not bother me as much as the “smart” people in media (even tech media) totally missing the point. They do not bother to do an iota of research and sound more or less like the general public while fawning over or criticizing the ONE AI system they know of - ChatGPT. More than lack of imagination, theirs is pure laziness.

31

Frumpagumpus t1_j9wvpep wrote

signals mean different things in different contexts.

i think you are extremely wrong to say very few practical use cases at this point (almost makes me question if you have used them much?)

even when vc money was "wrong", like in the dot com bubble, it turned out to be right, just early. (let's ignore crypto plz)

If anything maybe vc is late here lol (tho probly not and for the record i personally hold 6 month treasuries at this point just cuz i think market doesn't give a shit about much except for like mortgages and gov spending, ah yea and the whole taiwan thing could nuke appl from orbit and silicon valley bank may be insolvent or something?)

4

YobaiYamete t1_j9wu8l0 wrote

IMO SD and the AI tools are just fantastic complements to the rest of your artistic kit, just like photoshop and blender etc, but are still just tools in the kit rather than the whole kit

People who think they replace artists are not seeing the real picture. The only artists they replace are the lowest end artists, and all those artists have to do is adapt to the tech and they will still be relevant too.

Even with SD I still run into tons of situations where I need to use photoshop to tweak something or need to draw something, and I instantly run into the limit of my artistic skill, because I'm not a real artist.

Which IMO, is the gap between an "ai artist" and an actual artist. AI can make some really beautiful stuff (one of my favorites I've seen), but as soon as you need to customize it or make fine tweaks you start having to fight the AI rather than work with it

2