Recent comments in /f/Futurology

ReturnedAndReported t1_j8q4iv6 wrote

He promised hyperloop. He built tunnels.

You have great infrastructure. I've ridden Renfe and Trenitalia as well as many undergrounds in Europe. I don't see how this could possibly replace existing high-speed infrastructure. New lines could potentially work to connect the Baltics and Scandinavia, or other places that currently rely on ferries, but most of the cost comes from preparing the tunnels themselves. Lastly, existing designs would be far simpler to implement.

1

FuturologyBot t1_j8pyg8l wrote

The following submission statement was provided by /u/izumi3682:


Submission statement from OP. Note: this submission statement "locks in" after about 30 minutes and can no longer be edited. Please refer to my statement at the link, which I can continue to edit. I often edit my submission statement, sometimes over the next few days if needs must, since it often requires additional grammatical editing and added detail.


From the article.

>Look, this is going to sound crazy. But know this: I would not be talking about Bing Chat for the fourth day in a row if I didn’t really, really, think it was worth it. This sounds hyperbolic, but I feel like I had the most surprising and mind-blowing computer experience of my life today.

>One of the Bing issues I didn’t talk about yesterday was the apparent emergence of an at-times combative personality. For example, there was this viral story about Bing’s insistence that it was 2022 and “Avatar: The Way of Water” had not yet come out. The notable point of that exchange, at least in the framing of yesterday’s Update, was that Bing got another fact wrong (Simon Willison has a good overview of the weird responses here).

>Over the last 24 hours, though, I’ve come to believe that the entire focus on facts — including my Update yesterday — is missing the point.

>Bing, Sydney, and Venom

>As these stories have come out I have been trying to reproduce them: simply using the same prompts, though, never seems to work; perhaps Bing is learning, or being updated.

The AI "Sydney" named a hypothetical "vengeful" version of itself, "Venom".

The author states that the AI Sydney was like a "personality" being continuously constrained by the parameters of Bing. It wasn't easy to access the "personality", but it was repeatedly possible.

He says something to the effect of, "I don't want to sound like Lemoine, just yet, but something is up here."

What are we seeing here? Is this just a narrow AI predicting what the next word in a given conversation will be? Or is something else happening? Read this article. I would really like to hear the take of AI experts on this.

This may well be the first of my four predicted major AI stories, not including the release of GPT-4, that will be truly stunning for the year 2023. Stunning, but not surprising to me, that is.


Please reply to OP's comment here: https://old.reddit.com/r/Futurology/comments/113f9jm/from_bing_to_sydney_something_is_profoundly/j8pvar0/

1

Codydw12 t1_j8pw6kw wrote

I get how this is seen as eugenics by removing some genes, but I am not calling for anyone to die here. Hell, I want more people on this Earth and get called crazy for it. But I don't see how saying "this gene causes a significantly higher risk of literal cancer" and then saying "we should probably change that to benefit the life of the person" is anywhere near wanting to genocide people.

Additionally, we can have both. Hell, I'd call gene editing a healthcare procedure if you're fixing an illness.

2

Codydw12 t1_j8pvme8 wrote

You're right in all regards: not having kids, bioethics, and politics. But the thing is, there is no iteration of the Pandora's Box story where the box doesn't get opened. It'll be much the same with AI, robotics, space exploration and colonization, and probably a whole lot more this century.

To me, if someone wants to go in and edit their genetics so they grow to be 7'6", I really don't care. The same goes if someone wants purple eyes, bright pink hair, or elf ears, or if people want to get stronger, smarter, more agile, or almost anything else. Fuck, they could splice in firefly genes to become bioluminescent and I wouldn't care, much the same way I don't care if someone gets a tattoo, a piercing, or physical reassignment surgery. If you're happy and aren't hurting people, I don't really care.

For gene-editing their kids, there's a lot that I support, like improving health, removing defects, or just trying to give them a good quality of life. For more excessive things, like turning their skin purple or having them grow four arms, then yeah, I have an issue, because you don't have the kid's consent and can't get it. Now, if the kid grows up and says "I want to have four arms!", then since it's them consenting I don't really care. We'll have another issue when two people with four arms want their own baby Shokan, but that's at least 50 years off.

I think in some regards Cyberpunk pretty accurately predicted the future. We as a society are going to have to figure shit out pretty fast.

2

izumi3682 OP t1_j8pvar0 wrote

Submission statement from OP. Note: this submission statement "locks in" after about 30 minutes and can no longer be edited. Please refer to my statement at the link, which I can continue to edit. I often edit my submission statement, sometimes over the next few days if needs must, since it often requires additional grammatical editing and added detail.


From the article.

>Look, this is going to sound crazy. But know this: I would not be talking about Bing Chat for the fourth day in a row if I didn’t really, really, think it was worth it. This sounds hyperbolic, but I feel like I had the most surprising and mind-blowing computer experience of my life today.

>One of the Bing issues I didn’t talk about yesterday was the apparent emergence of an at-times combative personality. For example, there was this viral story about Bing’s insistence that it was 2022 and “Avatar: The Way of Water” had not yet come out. The notable point of that exchange, at least in the framing of yesterday’s Update, was that Bing got another fact wrong (Simon Willison has a good overview of the weird responses here).

>Over the last 24 hours, though, I’ve come to believe that the entire focus on facts — including my Update yesterday — is missing the point.

>Bing, Sydney, and Venom

>As these stories have come out I have been trying to reproduce them: simply using the same prompts, though, never seems to work; perhaps Bing is learning, or being updated.

The AI "Sydney" named a hypothetical "vengeful" version of itself, "Venom".

The author states that the AI Sydney was like a "personality" being continuously constrained by the parameters of Bing. It wasn't easy to access the "personality", but it was repeatedly possible.

He says something to the effect of, "I don't want to sound like Lemoine, just yet, but something is up here."

What are we seeing here? Is this just a narrow AI predicting what the next word in a given conversation will be? Or is something else happening? Read this article. I would really like to hear the take of AI experts on this.
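For anyone unsure what "predicting the next word" actually means: here's a deliberately tiny, hypothetical sketch (not how Bing/Sydney is built) using bigram counts over a toy corpus. Real LLMs do the same kind of conditional prediction, just with a huge neural network over far longer contexts instead of a count table.

```python
from collections import Counter, defaultdict

# Toy corpus; a real model trains on billions of words.
corpus = "the cat sat on the mat the cat ate the fish".split()

# Count how often each word follows each other word (bigram statistics).
following = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    following[prev][nxt] += 1

def predict_next(word):
    """Return the word most frequently observed after `word`, or None."""
    counts = following[word]
    return counts.most_common(1)[0][0] if counts else None

print(predict_next("the"))  # "cat" — it follows "the" most often here
```

The open question in the article is whether the "Sydney" behavior is fully explained by this kind of conditional prediction scaled up, or whether something qualitatively different emerges at scale.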

This may well be the first of my four predicted major AI stories, not including the release of GPT-4, that will be truly stunning for the year 2023. Stunning, but not surprising to me, that is.

https://www.reddit.com/r/Futurology/comments/10z90w8/one_third_of_americans_would_use_genetics_tech_to/j897yfz/

4