Recent comments

Rottimer t1_jegwjx3 wrote

People complaining about this are ridiculous. It would rise to $17.25/hour next year. That's $35,880/year, IF you can get 40 hours per week all year long at a minimum wage job. In NYC, that already precludes you from living alone. That level of income requires a roommate even in the shittiest parts of the city. And you're not going to be able to save shit - so good luck moving if you need to pay first month's rent and a security deposit.
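The annual figure above can be sanity-checked in a couple of lines (assuming a full 40-hour week, 52 weeks a year, with no unpaid time off):

```python
# Verify the $35,880/year figure: $17.25/hour, 40 hours/week, 52 weeks/year.
hourly = 17.25
annual = hourly * 40 * 52
print(annual)  # 35880.0
```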

Don't worry, Starbucks is already paying this much, so the increase won't be responsible for raising the price of your caramel macchiato. The price might go up anyway, but it won't be because of this.

79

HarbingerDe t1_jegwjgf wrote

>If I propose to end slavery in the 1800s, your objection of "who would pick the cotton!?" is not a rebuttal.

Typical right-wing / conservative move of, "uhh actually we're totally the ones who are against slavery... Yeah... It was us..."

The scenarios are not analogous at all.

>New horizons will be created. What they will be I cannot even begin to guess.

You are fundamentally at odds with the premise of the sub; this seems to be the biggest thing you're not grasping.

If you believe we're on the cusp of developing a self-improving entity that is more intelligent, more creative, and all-around more capable than a human at any given task, then there can't be any new horizon that such an AI wouldn't be better positioned to take advantage of.

2

martianunlimited t1_jegwj6v wrote

ChatGPT (and other GPT-3.5-based transformers) has 175 billion parameters and wouldn't even fit on a dedicated RTX 4090. And before you ask "why not just run a smaller model": ChatGPT's performance is highly dependent on its size (which is why people outside the machine learning community don't hear much about GPT-1 and GPT-2). And while there are efforts to make models smaller (see Alpaca), you would still need a top-of-the-line GPU just to fit those smaller models, taking resources away from the thing players actually care about: the graphics.
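To put numbers on "wouldn't even fit": a quick back-of-the-envelope estimate of the memory needed just for the weights of a 175B-parameter model, at various precisions, against the RTX 4090's 24 GB of VRAM (the per-parameter byte counts are standard; activations and KV cache would only add to this):

```python
# Rough VRAM estimate for a 175B-parameter model's weights alone,
# ignoring activations and KV cache, versus an RTX 4090 (24 GB).
PARAMS = 175e9
RTX_4090_GB = 24

def weights_gb(bytes_per_param: float) -> float:
    """Memory (GB) needed to hold the weights at a given precision."""
    return PARAMS * bytes_per_param / 1e9

for label, bytes_pp in [("fp32", 4), ("fp16", 2), ("int8", 1), ("int4", 0.5)]:
    gb = weights_gb(bytes_pp)
    print(f"{label}: {gb:.0f} GB (~{gb / RTX_4090_GB:.0f}x an RTX 4090)")
```

Even at fp16 that's 350 GB, roughly fifteen 4090s' worth of memory, before the game's own assets take their share.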

So the practical way to incorporate ChatGPT into games would be to send the chat prompt to a server and suffer a whole lot of latency waiting for the response. It's possible, but it wouldn't be a good gaming experience. Wait (at least) 10 years, until consumer-grade hardware has the capacity of today's datacenter-grade hardware (assuming we don't hit the end of Moore's law first), and you might find it more commonplace.
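A rough latency budget shows why the server round-trip hurts. The round-trip and generation times below are illustrative assumptions, not measurements:

```python
# Why a server round-trip per NPC line is painful at 60 fps.
# NETWORK_RTT_MS and INFERENCE_MS are assumed, illustrative numbers.
FRAME_MS = 1000 / 60      # ~16.7 ms of budget per frame at 60 fps
NETWORK_RTT_MS = 100      # assumed round-trip to an inference server
INFERENCE_MS = 2000       # assumed generation time for a short reply

total_ms = NETWORK_RTT_MS + INFERENCE_MS
print(f"response arrives ~{round(total_ms / FRAME_MS)} frames later")
```

Over a hundred frames of dead air per reply is fine for a chatbot window, but it's an eternity mid-gameplay unless the game hides it behind animation or pre-fetching.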

3