Recent comments in /f/Futurology

theironlion245 t1_j9j40ki wrote

First of all, if my Chevy Bronco had feelings I could make love to it, and I find that beautiful. Second, ChatGPT is an AI, we're talking about AI having feelings, and I needed an example to illustrate the point; ipso facto, 1+1=2.

1

Poly_and_RA t1_j9j3p8p wrote

Yepp. And the other killer features they tend to brag about are similarly dumb and/or already covered by better options.

Hang out with your long distance friends they say.

But here's the thing: I've already been playing games with long-distance friends in virtual worlds for over 2 decades. There's nothing new in this. World of Warcraft came out 20 years ago, and it's been 45 years since the first MUDs came online.

VR plays no role worth mentioning in this. Advertising and wild claims notwithstanding, playing WoW in VR isn't more compelling than playing it on a plain old monitor.

Converse with your friends they say.

But for this, my main wishes are things like high-quality video and audio with a minimum of lag, stuttering, or other quality issues. And VR doesn't actually help with that in the slightest. No, I don't really care whether I can "walk around" the friend I'm talking to -- but I do care that the audio quality is good and that the picture doesn't freeze.

It's possible that some killer use for VR will be found at some point. But so far I've seen nothing compelling.

1

Poly_and_RA t1_j9j300v wrote

We've been doing this for a long time already. Fifty years ago, complex factories were built as scale models first in order to detect problems before construction started on the real factory. Today (and for the last couple of decades) we use digital models instead.

But VR and "the metaverse" play essentially zero role in all of this. 99% of it happens on ordinary flat 2D computer-monitors.

1

rckrusekontrol t1_j9j1lgr wrote

I can think about this forever and never get anywhere: what makes emotions real? If we put receptors on an AI and programmed damage = pain, pain = insufferable, how would it actually differ from what our brains do? Does our pain exist as something tangible, or does pain not exist at all, aside from our brains telling us that nociceptors mean a bad time?

We know an ant doesn't feel the same pain as us (if any; pain is pretty much an emotion), but a dog is somewhere in between. Is there a line where pain becomes real, tangible, significant?
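The "damage = pain" setup in the thought experiment above can be written down as a toy program (all names and thresholds here are made up for illustration; this is obviously nothing like a brain or a real AI system):

```python
# Toy sketch of the "receptors -> damage signal -> aversive state" mapping
# from the thought experiment. Everything here is invented for illustration,
# not a claim about how brains or real AI systems work.

def nociceptor(damage: float) -> float:
    """Map physical damage to a raw 'pain' signal, clamped to [0, 1]."""
    return max(0.0, min(1.0, damage))

def aversive_response(pain: float, threshold: float = 0.3) -> str:
    """Above the (arbitrary) threshold the agent treats the signal as
    insufferable and withdraws; below it, it ignores the signal."""
    return "withdraw" if pain > threshold else "ignore"

signal = nociceptor(0.8)
print(aversive_response(signal))  # prints "withdraw"
```

The open question in the comment survives the sketch intact: mechanically, this loop does what nociceptors do, and nothing in the code tells you whether anything is "felt".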

5

superjudgebunny t1_j9iykt0 wrote

Instruction sets are just ways for us to let it do tasks faster. All those years using AI learning to get to facial recognition, voice recognition, and pattern matching: we were creating shortcuts, a lot of them.

You don't need instruction sets; it could all be done without them. They just make things faster and need less overhead.

What we have been doing is creating those kinds of instructions and making them faster. What do you think instruction sets are? Task-specific operations for a specific architecture.

We are living through the culmination of AI instruction sets creating an overlaying architecture: learning to apply these different AI-assisted learned skills in a more robust arch.

No, it's not really an arch; it's more like includes in C++. Eventually a lot of that work will become hardware. Think of encoding codecs: we started moving those to hardware and away from software.
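The "shortcut" idea here, many generic steps collapsed into one specialized operation, can be sketched minimally (names are made up; real hardware offload such as fused multiply-add units or dedicated codec blocks works at a very different level):

```python
# Illustration of the "instruction set as shortcut" idea: the same
# multiply-accumulate done as many generic steps vs. one fused operation.
# A hardware FMA instruction or a codec block plays the role of the
# fused version; this is only an analogy in software.

def dot_generic(a, b):
    """Many small generic operations: one multiply and one add per element."""
    total = 0.0
    for x, y in zip(a, b):
        total += x * y
    return total

def dot_fused(a, b):
    """Pretend 'single instruction': the whole task as one opaque operation,
    standing in for a dedicated hardware unit."""
    return sum(x * y for x, y in zip(a, b))

a, b = [1.0, 2.0, 3.0], [4.0, 5.0, 6.0]
assert dot_generic(a, b) == dot_fused(a, b) == 32.0  # same result, fewer steps
```

Same answer either way; the point of moving it "down" into hardware is only speed and overhead, which matches the argument above.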

The server farms are playing a huge role in this. They are the uarch branches that are currently being formed. As we develop and merge these, eventually we will get to developing consciousness in some form.

1

Sikkus t1_j9iyasi wrote

That's an excellent question for people, too. Childhood trauma can affect the brain such that some emotions, empathy for example, can't be felt. If the adult cares enough to listen to feedback from others, he might start believing that he is empathic. He might say the right words in reply to sadness, but the real feeling isn't there inside.

Source: I'm going to therapy for this exact thing.

1

scummos t1_j9iy72c wrote

> I actually think it has the possibility of swinging the other way (at least in some areas).

I honestly think there's no alternative. The internet has long suffered from a problem with algorithmically curated, if not outright generated, content. That era will come to an end, because already now googling things tends to yield heaps of garbage to sift through to get to the one good piece of information, and it will get worse quickly with all this AI tooling available.

I guess people will turn back to reading their favourite blogs and websites more (or the more modern counterparts realized on Instagram or whatever -- same concept: you look at content created or curated by a specific person), and explore through what the people they trust point them to. Which is probably a good thing, since exploring the algorithm-curated landscape was usually not particularly great. (I'm looking at you, "YouTube related videos".)

I think the algorithmically curated landscape (e.g. Google search) will retract more and more toward just buying and selling things, because that's a transaction with real money and real goods attached, which cannot plausibly be fudged by AI.

4

Zer0pede t1_j9ix7nj wrote

Yeah, before if a bad writer wanted to submit something, they’d actually have to take the time and effort to write it. That slows them down and weeds out the lazy ones. Now they just have to write a prompt. Nothing to slow them down and nothing to weed out the laziest. Having to read the first several paragraphs of hundreds of submissions just sounds miserable—literally more work than it took them to “write” it. I would absolutely ban everyone who wasted my time like that.

8

69inthe619 t1_j9ix0tn wrote

This is not a philosophical question, a scientific question, or even a debate if you consider organic and inorganic matter to be all the same. If they were the same, there would be no difference, but there is. By refusing to acknowledge that there is anything special about life, which we have yet to find anywhere else in the universe, you have oversimplified to the point where you are trying to recreate the Mona Lisa by finger painting. A $50 bag of inorganic matter from an abandoned RadioShack clearance bin cannot be magically converted by an inorganic AI to have the necessary qualities of organic matter while remaining inorganic and spontaneously producing feelings and emotions. And let's not forget the leap you take by jumping from the simple input of data to generating sentience. Are you going to say the only thing sentience requires is the same data that emotion requires? If you think that, congratulations: you know absolutely nothing about quantum physics, where everything exists only as a probability wave. It is there, where quantum probability intersects with the organic matter of your brain in the physical world, that sentience is possible.

Sentience is not something that came to be just for fun. Life does not expend energy making unnecessary things like five-legged humans; it evolves what it needs to survive out of sheer do-or-die necessity. And sentience is absolutely essential in order to navigate a universe where, at the foundation, there is only a probability of something, and also a probability that it is not. You had better be able to handle any outcome simultaneously so you can do basic things, like make a split-second decision that determines whether you escape certain death and live to reproduce, or die. The one thing life cannot afford is death before reproduction. In that context, sentience is the only thing separating successful reproduction and the continuation of the cycle of life from death before reproduction and the end of life.

But hey, sentience will just happen if you read enough books, right?

And no, a quantum computer does not recreate the quantum mechanics happening with the organic matter in your brain, so that is not a save for you. Also no, quantum computing is not the answer to every problem everywhere, because there is an entire set of mathematical problems where the number of pathways to check explodes beyond any practical limit, so computing the best answer is not feasible to begin with (e.g. the traveling salesman problem). Exploding search spaces make computers compute practically forever, and that means no soup for you or your AI. This doesn't even take into account all of the equations and constants baked into the universe that prevent the universe from being computable in any practical way. Think pi: 3.14..., an infinite, non-repeating number that shows up in every single sphere in existence. And gosh darn it, there is that infinity thing that makes computers compute infinitely. Now you have to start all over, again.
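For what it's worth, the traveling salesman problem is finite rather than infinite: with n cities there are (n-1)!/2 distinct tours, computable in principle but factorially explosive. A toy brute-force sketch (distances invented here for illustration) shows both the computation and the blow-up:

```python
import math
from itertools import permutations

# Brute-force traveling-salesman solver, purely to illustrate the growth
# being argued about: the search space is finite but factorial, so
# exhaustive search becomes hopeless long before it becomes "infinite".

def tour_length(order, dist):
    """Total length of a closed tour visiting cities in the given order."""
    return sum(dist[a][b] for a, b in zip(order, order[1:] + order[:1]))

def brute_force_tsp(dist):
    """Check every ordering of the cities and return the shortest tour length."""
    cities = list(range(len(dist)))
    best = min(permutations(cities), key=lambda p: tour_length(list(p), dist))
    return tour_length(list(best), dist)

# Four cities at the corners of a unit square; the best tour walks the
# perimeter, so the optimal length is 4.
s = 2 ** 0.5  # diagonal of the unit square
dist = [[0, 1, s, 1],
        [1, 0, 1, s],
        [s, 1, 0, 1],
        [1, s, 1, 0]]
print(brute_force_tsp(dist))  # prints 4
print(math.factorial(20))     # orderings to check for just 21 cities
```

Four cities means checking 24 orderings; 21 cities already means 20! of them, which is the practical wall the comment is gesturing at.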

If acting like it and being it are no different, then Han Solo is a living, breathing, actual thinking human being and not Harrison Ford, because I have seen the Star Wars movies and he acts like it is real; so, by golly, it is just as real as reality, just like an AI acting like it has feelings is just as real as having feelings. Who taps out in a pain-endurance contest between you and your AI that read about pain in a book so it can provide a pained response to hypothetical pain? I imagine you have a 0.001% chance of outlasting the AI, but not because of you, or even the AI; you get that 0.001% because the quantum element of this universe makes it impossible for there ever to be 100% certainty about any future outcome. But hey, at least you have sentience, just like that bag of scrap from RadioShack (once you get around to watering it with data).

1