Recent comments in /f/MachineLearning

TeamRocketsSecretary t1_j8z7pqs wrote

Given your reply, I'm unsure why you'd even want to be able to follow the proofs.

Some of the proofs in optimization are particularly rough, so if you want to understand them, the only way is to wade through a book or, at the very least, online lecture videos + slides.

7

NotARedditUser3 t1_j8z5dy8 wrote

Imagine someone writes one that's explicitly aimed at manipulating your thoughts and actions.

An AI could likely come up with some insane tactics for this. It could feed off your Twitter page, find an online resume of you, scrape your other social media, or, in Microsoft's or Google's case, potentially scrape the emails you have with them, profile you in an instant, and then come up with a tailor-made advertisement or argument that it knows would land on you.

Scary thought.

21

lack_of_novelty006 t1_j8z1mc7 wrote

You generally won't get any value from an ICML submission even if you are the first author; however, being a co-author on an accepted ICML paper always counts. It should also improve your LOR. Regarding EMNLP, third author would add some value if it's a long paper; for a short paper it's of little value IMHO. Again, the paper needs to be accepted: zero value for submitted-and-rejected papers.

−4

prehensile_dick t1_j8z0dgl wrote

Corporations are scraping all kinds of copyrighted material and then profiting off the models, while the people doing all the labor get either nothing (the content creators) or poverty wages (the content labellers).

Their current push to promote LLMs as some sort of pinnacle of technology, when these models barely have any legitimate use cases and struggle with the most basic logic, will probably lead to a recession in the tech industry.

4

medwatt OP t1_j8yxl3j wrote

I'm neither a mathematician nor a computer scientist by trade. I have neither the need nor the time to go through the nitty-gritty details of optimization theory. All I need is an overview of the main ideas in this field. Think of it like knowing how to use a limit without needing to go through the epsilon-delta definition. Hope my reply didn't offend your ego.

−1

TeamRocketsSecretary t1_j8ywdem wrote

Lol dude, if you wanna learn optimization, the details and the length are what make the subject. If you want a high-level overview, read a blog post. At the very least, find an online offering of an optimization course with lecture videos, watch those, and read the slides if you can't be bothered to open a textbook.

All these low-effort posts in this sub from people just looking to cut corners are depressing.

3

ach224 t1_j8ywbup wrote

I have been using Colab for training big ML models. It is a good product. I would love a more straightforward integration with my own Python modules: right now I end up having to update my libraries locally, push to GitHub, and then refresh/restart the Colab kernel. Would love a better way to run my Python module in -e (editable) mode. Thanks.
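For what it's worth, a workaround some people use (not an official Colab feature, and the repo URL below is a hypothetical placeholder) is to clone the package into the Colab VM, install it with `pip install -e`, and then use the standard-library `importlib.reload` to pick up edits without restarting the kernel. The reload step itself looks like this, sketched here with a stdlib module standing in for your own package:

```python
import importlib
import json  # stand-in for your own editable-installed module

# After editing the module's source on disk, reload it in place
# instead of restarting the kernel; reload() returns the module.
json = importlib.reload(json)

print(json.loads('{"refreshed": true}'))
```

In a Colab cell you would first run something like `!git clone https://github.com/<you>/<pkg>.git` followed by `%pip install -e ./<pkg>`; both are ordinary shell/pip invocations, nothing Colab-specific. Note that `importlib.reload` only refreshes the one module object, so deeply nested packages may still need a kernel restart.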

2

ckperry t1_j8ygjep wrote

[edit] I give up on formatting

We've never really worked on a foolproof way to detect if you're using Colab, but this might work a little better for you:

import sys

probably_colab = False
if 'google.colab' in sys.modules:
    probably_colab = True
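The same check collapses to a one-liner, since membership in `sys.modules` is already a boolean; a minimal sketch, assuming nothing beyond the standard library:

```python
import sys

# True only inside a Colab kernel, where the 'google.colab'
# package has already been imported into sys.modules.
probably_colab = 'google.colab' in sys.modules

if probably_colab:
    # Colab-specific setup (e.g. mounting Drive) would go here.
    pass
```

As the variable name suggests, this is a heuristic, not a guarantee: any environment that imports `google.colab` would also trigger it.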

11