Recent comments in /f/technology

AbstractEngima t1_jac06su wrote

How is this even possible? Anyone with a brain knows that ChatGPT is nothing more than an unreliable narrator that pulls random bits of information and mashes them together into something inaccurate.

It's already basically following the same process as any other AI: taking little bits of existing information and putting them together based on patterns rather than an actual understanding of the source material.

8

bigfatmatt01 t1_jac05jz wrote

Spellcheck and grammar check both do the work that an editor would do in the proofreading step. When we were in school, that step was done by another student to teach us how editing works. I have no problem with those tools. ChatGPT replaces the process of turning ideas and thoughts into written words. That's what I have an issue with.

1

despitegirls t1_jaby6w1 wrote

I'm trying to understand that myself. Perhaps if you used it to summarize work that you created? I can't see trusting it as a source of information, since it doesn't provide sources for where it learned things, at least by default. This is something that Microsoft's implementation in Bing actually does.

15

slantedangle t1_jabxyi0 wrote

>If you're dealing with grammar, you're 99.9% doing it in the digital world and can easily fix it with assistance with no drawbacks.

>If you're in the casino wondering what's the chance of rolling snake eyes, you're unlikely to be pulling out your phone, so I'd say basic math is much more important than perfect grammar without assistance.

So you think students should learn math in case they want to gamble? That's your best argument?

1

C0rn3j t1_jabxicf wrote

If you're dealing with grammar, you're 99.9% doing it in the digital world and can easily fix it with assistance with no drawbacks.

If you're in the casino wondering what's the chance of rolling snake eyes, you're unlikely to be pulling out your phone, so I'd say basic math is much more important than perfect grammar without assistance.
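For what it's worth, the snake-eyes odds in question are easy to work out by hand; a quick illustrative sketch in Python:

```python
# Probability of rolling snake eyes: two fair six-sided dice, both showing 1.
from fractions import Fraction

# 36 equally likely ordered outcomes (6 faces x 6 faces); exactly one is (1, 1)
p_snake_eyes = Fraction(1, 36)

print(p_snake_eyes)         # 1/36
print(float(p_snake_eyes))  # about 0.0278, i.e. roughly a 2.8% chance
```

Which is exactly the kind of mental arithmetic the comment is arguing students should be able to do without a phone.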

0

Ok-Lobster-919 t1_jabwyb5 wrote

I don't think anybody fully understands the risks of not learning. I read a study about a reduction in neuroplasticity in people who relied on a GPS to navigate their surroundings, like a weakening of the ability to learn. I wouldn't be that surprised if similar neuroplastic changes followed from a failure to learn spelling. Isn't spelling just mapping words to their correct spellings in the brain?

https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6020662/

Here's a study; there could be a lot more work done, but I think it's interesting.

Have people thought about the implications of replacing more and more of the brain's spatial and logic centers with tools like ChatGPT? Fun stuff to think about! Maybe in the future people will be classified as "learned knowledge classically" and "has ChatGPT".

−2

slantedangle t1_jabwimo wrote

Why would anyone be allowed to quote ChatGPT in their essay?

What value would a teacher see in a student quoting ChatGPT? How does quoting ChatGPT improve education?

I can possibly see using it to get a summary, for one's own reading comprehension on a topic, but not as a source to quote from in your essay. It's built on top of language models. Essentially, it mimics our writing. Depending on what you feed it, "sometimes good, sometimes like shit."

46

SwagginsYolo420 t1_jabvmm9 wrote

Art depends on the intent, does it not?

I think the concept of craftsmanship is the issue here. Somebody can spend years mastering a specific creative technique through study and practice, and now somebody else can come along and press a few buttons to generate a very similar result. It's a lot to think about.

Craftsmanship is not required for creating art, though it is arguably a preferred ingredient for many who appreciate art.

4

A-Delonix-Regia t1_jabvfaa wrote

Reread my top-level comment. I said it is okay to use while learning. In my opinion, if you use such tools on graded assignments, you are hiding your own lack of knowledge and fooling yourself.

I mean, graded assignments are a flawed concept in education, but there is no real benefit in using AI tools there, since they do not reflect your own knowledge, unless you use the AI for research, like getting a quick summary of what nihilism is about and then rephrasing the output in your own words.

−3