Recent comments in /f/Futurology

FuturologyBot t1_j9hs8pi wrote

The following submission statement was provided by /u/0neiria:


From the article:

"ChatGPT is best known as an AI program capable of writing essays and answering questions, but now Microsoft is using the chatbot to control robots.
On Monday, the company’s researchers published a paper on how ChatGPT can streamline the process of programming software commands to control various robots, such as mechanical arms and drones.
“We still rely heavily on hand-written code to control robots,” the researchers wrote. Microsoft’s approach, on the other hand, taps ChatGPT to write some of the computer code. “Have you ever wanted to tell a robot what to do using your own words, like you would to a human? Wouldn’t it be amazing to just tell your home assistant robot: ‘Please warm up my lunch,’ and have it find the microwave by itself?” the researchers ask."

Microsoft researchers have published new work in which they "extended the capabilities of ChatGPT to robotics, and controlled multiple platforms such as robot arms, drones, and home assistant robots intuitively with language." They claim that this approach empowers even non-technical users to work with robots and ushers in a new paradigm for robotics that deeply integrates natural language.
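The pattern described here is roughly: expose a small library of high-level robot functions, describe that API to ChatGPT in a prompt along with the user's natural-language task, then execute the Python code the model returns. The following is a minimal illustrative sketch of that loop; the function names (`move_to`, `grab`, `release`) and the hard-coded model response are assumptions for demonstration, not Microsoft's actual code.

```python
class RobotArm:
    """Toy stand-in for a real robot-control backend."""
    def __init__(self):
        self.position = (0.0, 0.0, 0.0)
        self.log = []  # record of executed commands

    def move_to(self, x, y, z):
        self.position = (x, y, z)
        self.log.append(f"move_to({x}, {y}, {z})")

    def grab(self):
        self.log.append("grab()")

    def release(self):
        self.log.append("release()")

def build_prompt(task: str) -> str:
    """Describe the available API to the model, then state the task."""
    api_doc = (
        "You control a robot arm with these Python functions:\n"
        "  arm.move_to(x, y, z)  # move the gripper to coordinates\n"
        "  arm.grab()            # close the gripper\n"
        "  arm.release()         # open the gripper\n"
        "Respond with Python code only."
    )
    return f"{api_doc}\n\nTask: {task}"

prompt = build_prompt("Pick up the object and place it on the shelf")

# In a real pipeline, `generated` would be ChatGPT's response to `prompt`;
# here a plausible response is hard-coded to show the execution step.
generated = (
    "arm.move_to(0.3, 0.1, 0.2)\n"
    "arm.grab()\n"
    "arm.move_to(0.0, 0.5, 0.2)\n"
    "arm.release()"
)

arm = RobotArm()
exec(generated, {"arm": arm})  # a real system would review/sandbox this first
```

The paper stresses keeping a human in the loop to inspect the generated code before it runs on hardware, which is why the execution step above is the part a real deployment would gate.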


Please reply to OP's comment here: https://old.reddit.com/r/Futurology/comments/118ktux/microsoft_researchers_are_using_chatgpt_to/j9hohry/

1

DomesticApe23 t1_j9hr70b wrote

That's not even new. People have been finding meaning in sunsets and the sound of babbling brooks for millennia. People already assign meaning to nonsense, are unable to distinguish bullshit from meaning, and Rupi Kaur is a famous poet. You can generate trite verse with ChatGPT right now that is just as meaningful as her banal nonsense, and if you market it right, people will lap it up. What's the difference?

It's not an intrinsic property of the work you're talking about; it's perception. Right now ChatGPT sucks at creating fiction, but not because 'it still doesn't understand'. It will never understand. All it has to do is complexify its model enough to encompass longer forms, and all that takes is raw data.

I don't really know what you mean by 'actual art'.

2