Recent comments in r/MachineLearning

dustintran t1_j8bdcv6 wrote

r/MachineLearning today has 2.6 million subscribers. The greater the influx of newcomers, the more the beginner-friendly posts get upvoted. This is OK (don't get me wrong); it's just a different setting.

Academic discussions were popular back when there were only 50-100K subscribers. In fact, I remember being in the OpenAI offices in 2017 and seeing, every morning, a row of researchers with reddit open on their monitors. The discussions mostly happen on Twitter now.

56

machineko t1_j8b0zyv wrote

Are you interested in reducing the latency or just cutting down the cost? Can you run the workload on GPUs instead?

For BERT-type models, applying some compression (e.g. quantization or distillation) and an optimized inference library can easily get you a 5-10x speedup. If you're interested, I'd be happy to share more resources on this.
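To make that concrete, here is a minimal sketch of one such compression step, post-training dynamic quantization in PyTorch. The model name and the use of the Hugging Face `transformers` library are illustrative assumptions, not the commenter's specific setup:

```python
# Minimal sketch: dynamic quantization of a BERT-style classifier.
# Assumes `torch` and `transformers` are installed; the model name
# is a hypothetical example, not a specific recommendation.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

model_name = "bert-base-uncased"  # illustrative choice
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(model_name)
model.eval()

# Store Linear-layer weights in int8 and quantize activations on the
# fly; this shrinks the model and typically speeds up CPU inference.
quantized = torch.quantization.quantize_dynamic(
    model, {torch.nn.Linear}, dtype=torch.qint8
)

inputs = tokenizer("An example sentence.", return_tensors="pt")
with torch.no_grad():
    logits = quantized(**inputs).logits
```

For the inference-library half, exporting a model like this to an optimized runtime (ONNX Runtime, for instance) usually buys further speedups on top of the quantization.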

1

ArnoF7 t1_j8azbzj wrote

Discussion in this subreddit has always been a bit hit or miss. After all, reddit as a community has almost no gatekeeping. While that can be a good thing, there are of course downsides to it.

If you look at this post about batch norm, you see some people who brought up interesting insights, and a good chunk of people who clearly never even read the paper carefully. And that post is from 5 years ago.

81

berryaroberry t1_j8aveyl wrote

The following is my opinion, so it's biased. My feeling is that the sub was never really about academic discussions per se. The papers and academic discussions acted like vessels carrying people toward the "deep learning hype + money flow + industry jobs" island. If you follow most of the earlier discussions closely, you'll see there was never really a push for genuine understanding; rather, people were looking for an easy way to earn "publication currency". The initial impression was that having some kind of project or publication could land you a high-paying job. Later, people probably realized they don't actually need to worry about papers and such; a quick LLM-based project will land a high-paying job even faster. LLMs are at the peak of the hype right now, after all. Thus we have more random-looking posts.

28