Recent comments in /f/MachineLearning

tripple13 t1_j70wvid wrote

History has shown what happens at technological breaking points. Yes, you may not want to earn a living as a horse carriage chauffeur; however, there are opportunities to become a car chauffeur.

I think your premise is wrong, it’s not about replacement, it’s about evolution.

It’s not about ‘threatening’ jobs, but improving certain aspects of them.

3

2blazen t1_j70ux9o wrote

I thought so too, but haven't actually noticed any difference, other than that the davinci models don't have the extensive content filters.

>if you use it for work, $20 is negligible

If my company pays for it, sure; otherwise I'll always prefer the request-based pricing with a nice API that I can just call from my terminal.

1

gamerx88 t1_j70rs5v wrote

Without referring to the paper again, my intuition is that a pairwise loss over final outputs does not gel well with how the model is auto-regressively generating the text.

Generation with GPT is basically a token-by-token decoding process that conditions on all previous time steps. Think about the difference between a supervised learning problem and a reinforcement learning one: the former ignores the step-by-step nature of the generation scheme, and is a poorer fit for a decoding problem.
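The token-by-token decoding described above can be sketched as a greedy loop. This is a minimal toy illustration, not real GPT code: `next_token_logits` is a hypothetical stand-in for the model, and the four-word vocabulary is made up.

```python
# Toy sketch of autoregressive (token-by-token) greedy decoding.
# next_token_logits is a stand-in for a real language model;
# the vocabulary and transition table are hypothetical.

VOCAB = ["<eos>", "the", "cat", "sat"]

def next_token_logits(prefix):
    # Fake "model": deterministically emits the -> cat -> sat -> <eos>.
    order = {(): 1, (1,): 2, (1, 2): 3}
    target = order.get(tuple(prefix), 0)
    return [1.0 if i == target else 0.0 for i in range(len(VOCAB))]

def greedy_decode(max_steps=10):
    prefix = []  # tokens generated so far; each step conditions on all of them
    for _ in range(max_steps):
        logits = next_token_logits(prefix)
        tok = max(range(len(logits)), key=logits.__getitem__)  # argmax
        if VOCAB[tok] == "<eos>":
            break
        prefix.append(tok)
    return " ".join(VOCAB[t] for t in prefix)

print(greedy_decode())  # -> the cat sat
```

The point of the sketch is that the loss signal arrives only after the loop finishes, while the decisions are made one step at a time, which is why the setup resembles reinforcement learning more than one-shot supervised prediction.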

1

besabestin t1_j70qxf2 wrote

Most clients don’t even know what software they need. Software engineering isn’t entirely about writing code: you have to solve the client's problem, which sometimes the client can’t clearly articulate. Perhaps a time will come when programmers write a large project, assisted by an AI, in a very short time.

11