Recent comments in /f/singularity

Lawjarp2 t1_jb66hv1 wrote

The things that can slow it down are already in motion, but they can only push it back so far.

(1) A recession causing a drain on the companies trying to build AI. A recession is here.

(2) A war or other critical event causing interest rates to go high, leading to defaults in startups and even established companies. Interest rates will go all the way to 6% this year.

(3) Hardware/cost limits being hit. Better hardware will of course be available soon, but it's harder now to scale just by pumping in money. Training costs are already reaching hundreds of millions of dollars; going further is only possible with government funding or a high rate of return on these AI models.

(4) Isolation of a large country like China from chip manufacturing and procurement for AI.

Other things that could happen

(*) GPT-4 being a bust and thereby eroding confidence.

(*) OpenAI and other companies fail to monetize.

(*) Scaling may have reached its limits. Newer architectures take time.

But even with all this, it can only be slowed down by 5-10 years. We will still likely have AGI in the 2030s.

31

CertainMiddle2382 t1_jb61410 wrote

We are nearing self improving code IMO.

Once we get past that, we have crossed the threshold.

Seeing the large variance in hardware cost/performance across current models, I'd think the margin for improvement from software optimization alone is huge.

I believe we already have the hardware required for one ASI.

Things will soon accelerate; the box has already been opened :-)

48

jungleboyrayan t1_jb608cr wrote

ASML is a Dutch company that provides chip-making equipment, and they are the leading one. They agreed with the USA not to sell equipment to China etc. This will put these countries 10 years behind in the development of super tiny chips.

1

94746382926 t1_jb5xztj wrote

Man this sub can be something else sometimes. The thread is literally asking for what people think could slow things down and you get downvoted for stating your opinion lol. People need to lay off the hopium a little bit.

To be clear, I don't even really agree with your opinion (I think LLMs could see a slowing of improvement soon, but I think they will be quickly replaced). Regardless, we should want dissenting opinions, especially when we're asking for them.

16

ihateshadylandlords t1_jb5ktsj wrote

We still haven’t gotten through the bottleneck of R&D and making products available to the masses once proof of concept is established. I see a lot of posts on here that involve proof of concept for great products. But those products still have to be tested to make sure they don’t malfunction over time, and they have to be priced so the average person can afford them. A lot of things posted here will get shelved because the makers either can’t get the price down or the product malfunctions too often and they can’t fix it.

I think it’ll be a long time before we can accelerate/refine that part of the production process.

10

freeThePokemon246 t1_jb5fb9l wrote

I foresee a dead end in LLMs. Their core limitations are plainly visible, if one takes off their hype glasses. Once the spark of hope that is LLMs winks out of existence, we shall once again be back in a hopeless AI darkness. Maybe the next generation after us will be luckier.

1

DungeonsAndDradis t1_jb5ek3g wrote

According to history, this will only accelerate (towards extinction, I think).

To answer your question, the only thing that would slow down AI research is a large-scale, civilization-affecting event: a massive meteor strike, a deadly plague, nuclear war, or a CME (coronal mass ejection) that takes us back to the 1800s.

42