Recent comments in /f/Futurology

kharlos t1_j9slg66 wrote

We don't even apply this logic to many animals which are undoubtedly sentient, can suffer, and feel pain. We share recent common ancestors with many of these species and share zero with AI.

I'm not against granting AI rights in the future, but many animals will need to be granted rights before then, imo. I just think it's funny we're so anxious to treat something that feels no pain and has no sentience (at least not for a long time from now) with respect and as an equal, when we are absolute monsters to everything else living on this planet.

Let's first treat humans and everything that suffers with some BASIC respect before moving on to the mental gymnastics required to do the same for language models.

3

NorCalBodyPaint t1_j9skue6 wrote

I think we need to change things drastically.

De-emphasize calculation and fact regurgitation - machines can do that better and they are ubiquitous

Emphasize critical thinking x100 - so we can discern good from bad with more accuracy

Emphasize arts and communication skills - so we can work better in social environments

Emphasize social skills and psychology - so we can understand our own motivations and those of others

Learn the basics of rhetoric - so we can tell when someone is using rhetorical tricks to manipulate us

1

Mash_man710 t1_j9skr0m wrote

They. The government? Somebody in the government is contacting private businesses to get staff back in the office? Absolute bollocks. The key reason is the level of investment, debt, or long-term leases that organisations have in commercial real estate that is being underutilised. Not rocket science and not a conspiracy.

−2

spudmix t1_j9skf49 wrote

My bet's on some kind of auto-ML tool which allows organisations to feed in data, specify targets, and receive predictions.

We have things like that in industry already, but I think the burgeoning capabilities of things like Codex and ChatGPT mean that we might now be able to have an AI build the AI in a much more intelligent way than before.
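A minimal sketch of that "feed data, specify targets, receive predictions" loop, assuming nothing about any real auto-ML library (all names here are invented): the tool fits several candidate models and keeps whichever one fits the data best, so the user never picks a model at all.

```python
# Hypothetical auto-ML loop: try candidate model families, keep the best fit.

def mean_model(xs, ys):
    """Baseline candidate: always predict the mean of the targets."""
    m = sum(ys) / len(ys)
    return lambda x: m

def linear_model(xs, ys):
    """Candidate: least-squares fit of y = a*x + b."""
    n = len(xs)
    sx, sy = sum(xs), sum(ys)
    sxx = sum(x * x for x in xs)
    sxy = sum(x * y for x, y in zip(xs, ys))
    a = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    b = (sy - a * sx) / n
    return lambda x: a * x + b

def auto_fit(xs, ys, candidates=(mean_model, linear_model)):
    """Fit every candidate and return the one with the lowest squared error."""
    def sse(model):
        return sum((model(x) - y) ** 2 for x, y in zip(xs, ys))
    fitted = [c(xs, ys) for c in candidates]
    return min(fitted, key=sse)

# Usage: the caller supplies only data and targets, never a model choice.
best = auto_fit([1, 2, 3, 4], [2, 4, 6, 8])
print(round(best(5)))  # prints 10 (the linear candidate wins on this data)
```

The real versions of this idea swap in much richer candidate families (trees, neural nets, ensembles) and proper cross-validation, but the shape of the loop is the same.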

tl;dr Skynet

1

7grims t1_j9sk48d wrote

Seems to be a plethora of solutions:

- Make homework and home-project questions include an array of information that forces the student to read it properly and engage with it, even if the question gets pasted into an AI query

- More tests in class, without the help of phones or PCs

- More engaging classrooms, to see who is paying attention and whether the students are understanding

- More handwritten work, paper and pencil, to make sure the students read what they write

- etc

This actually seems beneficial to education. For years, studying meant daydreaming in class and afterwards being forced to go to a library or the net to get all the info you ignored while in class.

AI presents a good opportunity to reform education and improve it beyond what it ever was.

1

LettucePrime OP t1_j9sine9 wrote

Oh no that seems a bit silly to me. The last 15 years were literally about our global "store-everything" infrastructure. If we're betting on a race between web devs encoding tiny text files & computer engineers attempting to rescale a language model of unprecedented size to hardware so efficient it's more cost-effective to run on-site than access remotely, I'm putting money on the web devs lmao

1

Asleep_Barracuda4781 t1_j9si4gm wrote

I don't know enough about neuroscience to say anything of meaning. I would assume you would have to artificially connect neurons.

As someone else has pointed out in the comments, having knowledge of something does not equate to understanding it, let alone being able to implement your understanding. You would have to be able to download the understanding and muscle memory that come with experience, training, and practice.

If the AI can literally reprogram your brain by reconnecting individual neurons...at what point are you just an organic robot robbed of all agency and a mere plaything of the AI?

1

LettucePrime OP t1_j9sho3d wrote

EDIT: I am so sorry this is long as shit & it ends on a downer. It's just a really morose & unpleasant read.

Later in the thread I used a better comparison: Wolfram Alpha is not used to teach pre-calculus. Four-function calculators are not used to teach basic arithmetic. We gate a student's "generative ability" based on the skills we want them to develop. Trigonometry does not measure a student's ability to draw a sine function, but rather their ability to represent, measure, & manipulate one. The robot can draw the line to match your function, that's the easy part. Making sure your function is correct is the part you need to learn.

The essay is the function, not the line. It is the proof of the struggle with something new that will produce necessary skills for development. At the very least, it's proof that the user can read a new thing & generate a cogent output from it, which is such an impressive accomplishment in nature that teaching it to machines has caused significant economic & social disruptions.

It's evidence of a user's ability to interrelate information - a process so complex it must be done essentially from scratch every time the user alters even one parameter of their data set. Where mathematical reasoning, at least elementary math, linearly grows in complexity, allowing students to compress portions of the process generatively, no such linearity exists in any other discipline. No one studying Faust is saying: "I learned about 17th century English Literature last year. I'll just plug Paradise Lost into the machine to return a comparison between Milton's & Goethe's portrayals of aberrant desire."

Lastly, it's evidence of the user's ability to communicate, which can be considered a complex test of metacognition, a much simpler test of the arbitrary constraints of syntax, & a gauge for how fulfilling the experience was for the user. At the end of the day, that is what it's about.

We need people to have all of these skills. Many of them are difficult to learn. Most of them overlap with ChatGPT's advertised features. We are asking our education system to revolutionize itself in response to a new toy in an extremely short time while extremely underfunded & extremely overtaxed. This is a recipe for a goddamn catastrophe.

You asked what the actual fallout of the last several decades of neglecting liberal arts education has been, &, if I may be perfectly frank, I think it's produced a fucking wasteland. Our industries are corrupted by a revitalized mercenary fetish for cutting overhead & maximizing dividends at a human cost. Our public gathering places are being bulldozed & replaced with more profit-sucking real estate. Our actions are monitored, dissent is catalogued, & punishment is divvied out on an industrial scale. When it happens to us, so often we are incapable of placing it in a larger context. When it happens to others, we struggle with our incomplete grasp of empathy & susceptibility to reams of misinformation. All of this, helmed by engineers, computer scientists, lawyers, entrepreneurs, politicians, & citizens simultaneously over & under-educated.

I have a personal example. My dad held a degree in Nuclear Engineering & had nearly 30 years' experience in systems analysis, quality assurance, continuous improvement & adjacent managerial disciplines in the Energy, Aerospace, & Manufacturing industries. He died a year & a half ago. The disease was systematized ignorance. Delta variant was just a symptom.

2

ShowNudesForLove t1_j9sgst9 wrote

To learn from an AI like this, you have to have an intrinsic desire to learn something new. You want to know something, so you ask the AI. You care about what it has to say because you actively want to know the answer.

In general, the majority of K-12 students in the US do not possess this intrinsic desire to learn for all or even most of their courses. Good teachers find ways to build relationships with the students and build connections from the content to the students' lives in order to help foster that intrinsic desire.

That's a piece that will be extremely difficult to recreate because of how individualized it is.

Take for example a student who wants to know more about physics. They ask an AI to explain some concept. If the explanation is too high level, they'll then need to ask the AI to explain the pieces of it to them and break it down until they can build up their understanding. This requires a kind of metacognition that most students don't develop easily. And each step where they don't understand and need it broken down further is another block causing them to reconsider how much they actually care about learning the information in the first place.

For upper-level students or very motivated individuals, I think AI could potentially get there. But for the majority of schooling at primary and secondary levels, I think we are going to need teachers for a very long time.

11

The_Observatory_ t1_j9sf7hb wrote

What happens when people aren't educated enough to know which questions to ask the AI, and aren't educated enough to evaluate whether the given answers are accurate and relevant? The purpose of our education system isn't merely to give you answers to questions; you can already get that from Google. It is intended to teach you how to think.

3

fitblubber t1_j9sey1r wrote

Issues:

You have to find the minerals you want

You have to get to them & mine them

You have to smelt them

You have to turn that lump into something useful

If you want to move it anywhere else in the solar system, you have to move it out of the gravity well that is the moon

& then there's moon dust - it's extremely toxic

Sure, give it a go & see what happens, but I'd rather be mining asteroids :)

2