Recent comments in /f/MachineLearning
TeamRocketsSecretary t1_j8z7pqs wrote
Reply to comment by medwatt in [D] Short survey of optimization methods by medwatt
Given your reply, I'm unsure why you would want to be able to follow the proofs, then.
Some of the proofs in optimization are particularly rough, so if you want to understand them, the only way is to wade through a book, or at the very least online lecture videos + slides.
[deleted] t1_j8z74lr wrote
NotARedditUser3 t1_j8z5dy8 wrote
Imagine someone writes one that's explicitly aimed around manipulating your thoughts and actions.
An AI could likely come up with some insane tactics for this. It could feed off your Twitter page, find an online resume of you or scrape your other social media, or, in Microsoft's or Google's case, potentially scrape the emails you have with them, profile you in an instant, and then come up with a tailor-made advertisement or argument that it knows would land on you.
Scary thought.
mocny-chlapik t1_j8z3vox wrote
How should we control the exposure for people with low cognitive capabilities who might not understand what they are interacting with?
clisztian t1_j8z3t1r wrote
Reply to comment by dj_ski_mask in [Discussion] Time Series methods comparisons: XGBoost, MLForecast, Prophet, ARIMAX? by RAFisherman
I guarantee you a state space model will beat out any fancy named transformer for most “forecastable” problems. Even MDFA - signal extraction + exp integration for forecasting - will beat out these big ML models
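For anyone wondering what the state-space approach looks like in practice, here's a minimal sketch of its simplest member, the local level model (random-walk level + observation noise), filtered with a scalar Kalman filter. This is an illustration only, with a made-up series and placeholder noise variances; real work would use a library like statsmodels, which handles much richer structure:

```python
def local_level_forecast(y, q=1.0, r=1.0):
    """One-step-ahead filtered estimates for series y.
    q: level (state) noise variance, r: observation noise variance."""
    level, p = y[0], 1.0          # initial level estimate and its variance
    forecasts = [level]
    for obs in y[1:]:
        # Predict: the level is a random walk, so its mean carries over
        # and its uncertainty grows.
        p += q
        # Update: blend the prediction with the new observation.
        k = p / (p + r)           # Kalman gain
        level = level + k * (obs - level)
        p = (1 - k) * p
        forecasts.append(level)
    return forecasts

series = [10.0, 10.5, 11.2, 10.8, 11.5, 12.1]
print(local_level_forecast(series))
```

The gain `k` controls how aggressively the level chases new data; tuning the q/r ratio is what makes these models robust on "forecastable" series.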
lack_of_novelty006 t1_j8z1mc7 wrote
Reply to [D] Coauthor Paper? by [deleted]
A mere ICML submission generally won't carry any value even if you are the 1st author; however, being a co-author on an accepted ICML paper always counts. It should also improve your LOR. Regarding EMNLP, 3rd author would add some value if it's a long paper. For a short paper it's of little value IMHO. Again, the paper should be accepted; zero value for submitted-and-rejected papers.
Captain_Cowboy t1_j8z1lv4 wrote
Reply to comment by ckperry in [N] Google is increasing the price of every Colab Pro tier by 10X! Pro is 95 Euro and Pro+ is 433 Euro per month! Without notifying users! by FreePenalties
Maybe a little easier:
import sys
probably_colab = 'google.colab' in sys.modules
prehensile_dick t1_j8z0dgl wrote
Corporations scraping all kinds of copyrighted materials and then profiting off the models while the people doing all the labor are getting either nothing (for content generation) or poverty wages (for content labellers).
Their current push to promote LLMs as some sort of pinnacle of technology, when they barely have any legitimate use cases and struggle with the most basic logic, will probably lead to a recession in the tech industry.
kau_mad t1_j8yy3yy wrote
Reply to comment by athos45678 in [N] Google is increasing the price of every Colab Pro tier by 10X! Pro is 95 Euro and Pro+ is 433 Euro per month! Without notifying users! by FreePenalties
I use Colab because of its easy integration to Google Drive. What do you use for storage on Paperspace? I don’t want to pay another service (S3) for storage.
medwatt OP t1_j8yxl3j wrote
Reply to comment by TeamRocketsSecretary in [D] Short survey of optimization methods by medwatt
I'm neither a mathematician nor a computer scientist by trade. I have neither the need nor the time to go through the nitty-gritty details of optimization theory. All I need is an overview of the main ideas in this field. Think of it like knowing how to use a limit without having to go through the epsilon-delta definition. Hope my reply didn't offend your ego.
TeamRocketsSecretary t1_j8ywdem wrote
Reply to comment by medwatt in [D] Short survey of optimization methods by medwatt
Lol dude, you wanna learn optimization? The details and the length are what make the subject. If you want a high-level overview, look at a blog post. At the very least, find an online offering of an optimization course with lecture videos and watch those and read the slides if you can't be bothered to open a textbook.
All these low effort posts in this sub about people just looking to cut corners are depressing.
ach224 t1_j8ywbup wrote
Reply to comment by ckperry in [N] Google is increasing the price of every Colab Pro tier by 10X! Pro is 95 Euro and Pro+ is 433 Euro per month! Without notifying users! by FreePenalties
I have been using Colab for training big ML models. It is a good product. Would love a more straightforward integration with my own Python modules. I end up having to update my libraries locally, push to GitHub, then refresh/restart the Colab kernel. Would love a better way to run my Python module in editable (-e) mode. Thanks.
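One partial workaround for the edit-push-restart loop: import straight from the source tree and `importlib.reload` instead of restarting the kernel. This is a hedged sketch, not an official Colab feature — the temp directory here simulates a cloned repo, and `mypkg` is a hypothetical module name (note `reload` only re-executes that one module, not its importers):

```python
import importlib
import os
import sys
import tempfile

# Simulate a local package checkout (stand-in for your cloned repo).
repo = tempfile.mkdtemp()
with open(os.path.join(repo, "mypkg.py"), "w") as f:
    f.write("VERSION = 1\n")

sys.dont_write_bytecode = True  # avoid stale .pyc masking quick edits
sys.path.insert(0, repo)        # import from the source tree, no install

import mypkg
print(mypkg.VERSION)  # 1

# Edit the file (your `git pull`), then reload instead of restarting:
with open(os.path.join(repo, "mypkg.py"), "w") as f:
    f.write("VERSION = 2\n")
importlib.reload(mypkg)
print(mypkg.VERSION)  # 2
```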
marcus_hk t1_j8ypfa6 wrote
Take a look at Hebbian and adaptive resonance models. No backprop, no distinct training/inference phases.
machineko t1_j8yo6fd wrote
Reply to comment by askingforhelp1111 in [D] Speed up HuggingFace Inference Pipeline by [deleted]
Depends on what models you are using but for most transformers, running on GPUs may be much more efficient than CPUs when you consider $ / M inferences (or inf/$).
Are there specific EC2 instances you have to use or can you deploy on any EC2 instance?
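To make the $ / M inferences comparison concrete, here's a back-of-the-envelope sketch. The hourly prices and throughput numbers below are made-up placeholders, not real benchmarks or EC2 list prices — plug in your own measurements:

```python
def dollars_per_million(hourly_price, inferences_per_sec):
    """Cost in dollars per one million inferences for an instance."""
    inferences_per_hour = inferences_per_sec * 3600
    return hourly_price / inferences_per_hour * 1_000_000

# Illustrative numbers only: a pricier GPU box with high throughput
# vs. a cheap CPU box with low throughput.
gpu = dollars_per_million(hourly_price=1.20, inferences_per_sec=400)
cpu = dollars_per_million(hourly_price=0.34, inferences_per_sec=25)
print(f"GPU: ${gpu:.2f}/M   CPU: ${cpu:.2f}/M")
```

With these toy numbers the GPU comes out cheaper per inference despite the higher hourly rate, which is the point: compare $/inference at your actual batch sizes, not $/hour.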
___luigi t1_j8ymhhk wrote
Reply to [Discussion] Time Series methods comparisons: XGBoost, MLForecast, Prophet, ARIMAX? by RAFisherman
Recently, we started evaluating Time Series Transformers. TSTs showed good performance in comparison to other TS DL methods.
Sandy_dude OP t1_j8ylnxn wrote
Reply to comment by Cocaaah in [R] Looking for papers which are modified variational autoencoder (VAE) by Sandy_dude
Okay, thank you. May I ask you a couple of questions if any arise?
jamesvoltage t1_j8yjrqo wrote
Reply to comment by MysteryInc152 in [R] RWKV-4 14B release (and ChatRWKV) - a surprisingly strong RNN Language Model by bo_peng
State space models (S4, H3, etc) are also competitive with 2B param transformer language models and have an effectively infinite context window https://hazyresearch.stanford.edu/blog/2023-01-20-h3
ckperry t1_j8yhh4f wrote
Reply to comment by Appropriate_Ant_4629 in [N] Google is increasing the price of every Colab Pro tier by 10X! Pro is 95 Euro and Pro+ is 433 Euro per month! Without notifying users! by FreePenalties
Thanks! Though a lot of thanks to my buddy in Google Brain who saw this thread and pinged me this morning :)
ckperry t1_j8ygy9m wrote
Reply to comment by daking999 in [N] Google is increasing the price of every Colab Pro tier by 10X! Pro is 95 Euro and Pro+ is 433 Euro per month! Without notifying users! by FreePenalties
I was so close to getting this and then our partner team got hit by layoffs which set us back. Hoping before next school year to have something (not perfect), but we'll see.
ckperry t1_j8ygsbl wrote
ckperry t1_j8ygpc8 wrote
Reply to comment by [deleted] in [N] Google is increasing the price of every Colab Pro tier by 10X! Pro is 95 Euro and Pro+ is 433 Euro per month! Without notifying users! by FreePenalties
hey now no need to be snarky
ckperry t1_j8ygjep wrote
Reply to comment by DigThatData in [N] Google is increasing the price of every Colab Pro tier by 10X! Pro is 95 Euro and Pro+ is 433 Euro per month! Without notifying users! by FreePenalties
[edit] I give up on formatting
We've never really worked on a foolproof way to detect if you're using Colab, but this might work a little better for you:
import sys

probably_colab = False
if 'google.colab' in sys.modules:
    probably_colab = True
[deleted] t1_j8ydy6u wrote
keepthepace t1_j8ycv83 wrote
Reply to comment by ckperry in [N] Google is increasing the price of every Colab Pro tier by 10X! Pro is 95 Euro and Pro+ is 433 Euro per month! Without notifying users! by FreePenalties
Ah yes, several EU countries started sending warning shots about it. Makes sense. Good luck with the production fix on Friday evening!
BronzeArcher OP t1_j8z7vfi wrote
Reply to comment by NotARedditUser3 in [D] What are the worst ethical considerations of large language models? by BronzeArcher
Yeah that’s pretty frightening.