Recent comments in /f/MachineLearning
ckperry t1_j8xgy80 wrote
Reply to comment by Ronny_Jotten in [N] Google is increasing the price of every Colab Pro tier by 10X! Pro is 95 Euro and Pro+ is 433 Euro per month! Without notifying users! by FreePenalties
Yes, this was a mistake and it only impacted DK.
ckperry t1_j8xgvnp wrote
Reply to comment by FHIR_HL7_Integrator in [N] Google is increasing the price of every Colab Pro tier by 10X! Pro is 95 Euro and Pro+ is 433 Euro per month! Without notifying users! by FreePenalties
This was a bug. Sorry. Fixing asap.
ckperry t1_j8xgugf wrote
Reply to comment by Ulfgardleo in [N] Google is increasing the price of every Colab Pro tier by 10X! Pro is 95 Euro and Pro+ is 433 Euro per month! Without notifying users! by FreePenalties
This was a mistake and only impacted DK.
ckperry t1_j8xgt75 wrote
Reply to comment by bananskaftet in [N] Google is increasing the price of every Colab Pro tier by 10X! Pro is 95 Euro and Pro+ is 433 Euro per month! Without notifying users! by FreePenalties
This was a mistake, and only impacted DK.
ckperry t1_j8xfzm2 wrote
Reply to [N] Google is increasing the price of every Colab Pro tier by 10X! Pro is 95 Euro and Pro+ is 433 Euro per month! Without notifying users! by FreePenalties
[edit] This is fixed now. The prices shown in DK were incorrect, but afaict all users were charged correct amounts. If I'm wrong and someone was charged incorrectly, they can reach out at colab-billing@google.com
Hi, I lead product for Colab. Thanks for flagging. This is clearly a mistake and we're looking into how it slipped through our testing.
We'll get this fixed asap and proactively issue refunds to anyone impacted. We haven't changed prices for Colab Pro.
Sorry about this. If you hit weird things in the future, I'm @thechrisperry on twitter (I check that a little more religiously than reddit where I mostly lurk).
ekbravo t1_j8xfsdp wrote
Reply to comment by jimliu741523 in [R] The Table Feature Transformation Library Release by jimliu741523
Thank you for your response. I’m following you.
velcher t1_j8xfdd7 wrote
Reply to [D] Coauthor Paper? by [deleted]
In general, yes, being middle author in papers with > 3 authors is not great. It's better than having nothing though.
The best outcome you can get as a 2nd author is to be the 2nd of three authors (PhD student, undergrad, professor), contribute seriously to the project, and get a strong letter of recommendation from the professor attesting to that contribution.
Someoneoldbutnew t1_j8xewn6 wrote
Reply to [N] Google is increasing the price of every Colab Pro tier by 10X! Pro is 95 Euro and Pro+ is 433 Euro per month! Without notifying users! by FreePenalties
Just checked, my US prices are the same.
jimliu741523 OP t1_j8xatge wrote
Reply to comment by ImpossibleCat7611 in [R] The Table Feature Transformation Library Release by jimliu741523
The HeadJack framework was designed by us, and we have a paper that has been submitted to an ML conference and is currently in double-blind review, so it isn't convenient to make it public right now. The framework is based on a GAN with cross-domain and self-supervised learning. We will open it up in the future :)
jimliu741523 OP t1_j8x9047 wrote
Reply to comment by ekbravo in [R] The Table Feature Transformation Library Release by jimliu741523
Thanks for your kind words. This is the open version, not the enterprise one. The enterprise version does not release the dataset into the wild; the feature model is only deployed in your private pool. In a future version, we will consider replacing account info with an API key.
Wild_Basil_2396 t1_j8x8bdf wrote
Reply to [N] Google is increasing the price of every Colab Pro tier by 10X! Pro is 95 Euro and Pro+ is 433 Euro per month! Without notifying users! by FreePenalties
In India, the prices remain the same as previously listed.
Ronny_Jotten t1_j8x62do wrote
Reply to comment by FreePenalties in [N] Google is increasing the price of every Colab Pro tier by 10X! Pro is 95 Euro and Pro+ is 433 Euro per month! Without notifying users! by FreePenalties
It's an error. They've obviously listed the price in Danish Kroner, but with a euro sign by mistake. That didn't occur to you? The actual price in euros if you convert it from Kroner is about €12.73 for Pro and €58.16 for Pro+. Maybe you want to delete this post.
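For reference, the back-of-envelope conversion (the krone is pegged at roughly 7.46 kr per euro; the exact rate that day is an assumption on my part):

```python
# rough DKK -> EUR sanity check of the listed "euro" prices
DKK_PER_EUR = 7.46  # approximate pegged rate, not the exact daily rate
for tier, dkk in [("Pro", 95), ("Pro+", 433)]:
    print(f"{tier}: {dkk} kr ≈ €{dkk / DKK_PER_EUR:.2f}")
# Pro: 95 kr ≈ €12.73, Pro+: 433 kr ≈ €58.04
```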
hpstring t1_j8x5iyh wrote
Reply to comment by drinkingsomuchcoffee in [D] HuggingFace considered harmful to the community. /rant by drinkingsomuchcoffee
I'm a beginner in this field and I was wondering what it means for code to be "centralized" and "DRY". Does "centralized" mean putting a lot of code in a single file, and does "DRY" mean raw code that isn't very easy to read but is efficient or has some other advantage?
tysam_and_co t1_j8x4sjv wrote
I have been torn about Huggingface. They provide some wonderful services to the community, but unfortunately the API design is very unintuitive and hard to work with, and the documentation is outdated. Much of the design also tries to accommodate too many standards at once, I think, and switching between them or doing similar things requires in-place operations or setting flags that permanently become part of an object, rather than a chain of calls I can update with normal control flow.
On top of that, far too many external libraries get installed with any HF package, and the library is very slow to load and work with. I avoid it like the plague unless I'm required to use it, because it usually eats the most debugging time. For example, I once spent well over half the time for implementing a new method just trying to debug Huggingface, before shutting down the server because I had already spent an hour or an hour and a half tracing through the source code trying to fix it. And when it did work, it was incredibly slow.
Now, that said, they also provide free models and free access to datasets like ImageNet. Do I wish it were an extremely light, fast, and simple wrapper? Yes, that would be great. But they provide what they provide, and they put in a lot of effort to make it accessible to everyone. That shouldn't be ignored because of any personal beefs with the library.
All in all, it's a double-edged sword. I wish there were a bit more simplicity, focus, self-containment, understandability, and speed in the HF codebase at large, but at the same time I sincerely appreciate the models and datasets they offer to the community, regardless of the hoops one might have to jump through to get them. If one stays within the HF ecosystem, certain things are indeed pretty easy.
If anyone from HF is reading this, I hope it doesn't feel like a total dunk. I'm just very torn because it's a mixed bag: I can see that a lot of care went into much of this codebase, and I think it could be tightened up a ton going forward. There are real positives about HF despite my beefs with the code (HF Spaces included in this particular calculus).
emotionalfool123 t1_j8x49h8 wrote
Reply to comment by dj_ski_mask in [Discussion] Time Series methods comparisons: XGBoost, MLForecast, Prophet, ARIMAX? by RAFisherman
Then it seems this is on par with the confusion R time-series libraries cause.
RAFisherman OP t1_j8x39qj wrote
Reply to comment by pyfreak182 in [Discussion] Time Series methods comparisons: XGBoost, MLForecast, Prophet, ARIMAX? by RAFisherman
After skimming the paper, it seems like Time2Vec is kind of like a "seasonality" factor (similar to what Prophet outputs). Is that true?
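For my own understanding, here's a toy sketch of Time2Vec as I read the paper: one linear (trend-like) component plus sinusoidal (seasonality-like) components. The names and shapes here are just my reading of it, so treat this as an assumption:

```python
import numpy as np

def time2vec(tau, omega, phi):
    """Toy Time2Vec embedding of a scalar time index tau.

    Component 0 is linear (trend-like); the rest are sinusoidal
    (seasonality-like), which is what makes the analogy to a
    Prophet-style seasonality term feel plausible.
    """
    linear = omega[0] * tau + phi[0]
    periodic = np.sin(omega[1:] * tau + phi[1:])
    return np.concatenate([[linear], periodic])

# toy example: a 4-dimensional embedding of a time step index
rng = np.random.default_rng(0)
omega = rng.normal(size=4)  # learned frequencies in the real model
phi = rng.normal(size=4)    # learned phases in the real model
print(time2vec(tau=42.0, omega=omega, phi=phi))
```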
[deleted] t1_j8x36kz wrote
Reply to comment by fasttosmile in [D] HuggingFace considered harmful to the community. /rant by drinkingsomuchcoffee
[deleted]
RAFisherman OP t1_j8x2wdw wrote
Reply to comment by pyfreak182 in [Discussion] Time Series methods comparisons: XGBoost, MLForecast, Prophet, ARIMAX? by RAFisherman
Didn’t think of that. Will take a look!
I do care about interpretability to some extent, which is why embeddings sound complex to me. But now I'm curious for sure.
BenXavier t1_j8x2m9b wrote
Reply to comment by weeeeeewoooooo in [Discussion] Time Series methods comparisons: XGBoost, MLForecast, Prophet, ARIMAX? by RAFisherman
Hey, this is quite interesting, but a bit beyond me. I know that eigenvalues come from linear transformations; how do you expose the linear component of a given TS model by recursively applying it?
Sorry for the basic question: tutorials, books and references are welcome
dj_ski_mask t1_j8x2m11 wrote
Reply to comment by emotionalfool123 in [Discussion] Time Series methods comparisons: XGBoost, MLForecast, Prophet, ARIMAX? by RAFisherman
I am knee deep in this library at work right now.
Pros: they implement tons of algos and regularly update with the ‘latest and greatest,’ like NHITS. Also can scale with GPUs/TPUs for the algos that use Torch backend. Depending on the algo you can add covariates and the “global” models for multivariate time series are impressive in their performance.
Cons: my god it’s a finicky library that takes considerable time to pick up. Weird syntax/restrictions for scoring and evaluating. Differentiating between “past” and “future” covariates is not as cut and dried as documentation makes it seem. Also, limited tutorials and examples.
All in all, I like it and am making a speed run at learning this library for my time-series needs.
To OP I would suggest NHITS, but also: the tree-based methods STILL tend to win on the data I work with.
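For a flavor of the past/future covariate distinction, here's a minimal sketch with Darts-style calls (Darts is the library that matches the NHITS support and this covariate split; I'm writing the calls roughly from memory, so treat the exact API as approximate):

```python
import numpy as np
import pandas as pd
from darts import TimeSeries
from darts.models import NHiTSModel

# toy daily target plus one covariate that is only known historically
idx = pd.date_range("2022-01-01", periods=200, freq="D")
df = pd.DataFrame({
    "date": idx,
    "y": np.sin(np.arange(200) / 7) + np.random.normal(0, 0.1, 200),
    "temp": np.cos(np.arange(200) / 7),
})
target = TimeSeries.from_dataframe(df, time_col="date", value_cols="y")
past_cov = TimeSeries.from_dataframe(df, time_col="date", value_cols="temp")

# "past" covariates: only their historical values are available at predict time;
# "future" covariates (e.g. calendar features) would be passed separately
model = NHiTSModel(input_chunk_length=28, output_chunk_length=7, n_epochs=5)
model.fit(target, past_covariates=past_cov)
forecast = model.predict(n=7, past_covariates=past_cov)
print(forecast.values())
```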
TenaciousDwight t1_j8x1o3r wrote
Reply to [N] Google is increasing the price of every Colab Pro tier by 10X! Pro is 95 Euro and Pro+ is 433 Euro per month! Without notifying users! by FreePenalties
Mine is still $9.99/mo. Guess I can put my pitchfork away.
Tripanes t1_j8x108o wrote
Reply to comment by pennomi in [N] Google is increasing the price of every Colab Pro tier by 10X! Pro is 95 Euro and Pro+ is 433 Euro per month! Without notifying users! by FreePenalties
KoboldAI is a way to run whatever engines you can throw into it. It's more a UX layer than any specific AI.
That said, it has the best ones I'm aware of.
Academic-Poetry t1_j8x0owj wrote
Reply to [D] Short survey of optimization methods by medwatt
Algorithms for Optimization by Mykel J. Kochenderfer and Tim A. Wheeler
An accessible introduction to a variety of methods, with code examples in Julia.
weeeeeewoooooo t1_j8wyqaj wrote
Reply to [Discussion] Time Series methods comparisons: XGBoost, MLForecast, Prophet, ARIMAX? by RAFisherman
You should probably try all four. There are some simple ways to do the comparison yourself. You can easily compare time-series models and the robustness of their training by using them to recursively predict the future, feeding their outputs back into themselves (regardless of whether they were trained that way).
This will expose the properties of the eigenvalues of the model itself. Failure of a time-series model to match the larger eigenvalues of a system means it is failing the fundamentals and not able to capture the most basic global properties of the system you are trying to fit.
You don't necessarily have to do any fancy calculations. If the models fail to maintain the same qualitative patterns apparent in the original data over long time periods of self-input, then that means they are failing to capture the underlying dynamics. Many models eventually explode or decay to some fixed point (like a cycle or fixed value). This is a red flag that either the model is inadequate or training has failed you.
A simple dummy test for this would be training on something like a spin glass or Lorenz attractor, any kind of chaotic system really. Or just look along any interesting dimension of the data that you are using. A good model when recursively applied to itself will look very similar to the original signal in how it behaves regardless of phase.
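Here's a rough sketch of the kind of dummy test I mean, using a Lorenz attractor and a deliberately simple one-step model; all the names are illustrative, and you would swap in whatever forecaster you're actually evaluating:

```python
import numpy as np
from scipy.integrate import solve_ivp
from sklearn.linear_model import Ridge

def lorenz(t, s, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    x, y, z = s
    return [sigma * (y - x), x * (rho - z) - y, x * y - beta * z]

# simulate the true chaotic system
t_eval = np.arange(0, 60, 0.01)
sol = solve_ivp(lorenz, (0, 60), [1.0, 1.0, 1.0], t_eval=t_eval)
states = sol.y.T  # shape (T, 3)

# fit a one-step-ahead model: s[t+1] ~ s[t] (stand-in for your forecaster)
X, y = states[:-1], states[1:]
model = Ridge(alpha=1e-6).fit(X, y)

# closed-loop rollout: feed the model its own predictions
rollout = [states[len(states) // 2]]  # start mid-trajectory
for _ in range(2000):
    rollout.append(model.predict(rollout[-1].reshape(1, -1))[0])
rollout = np.array(rollout)

# a model that captured the dynamics keeps oscillating around the two lobes;
# exploding or collapsing to a fixed point is the red flag described above
print("one-step R^2:", model.score(X, y))
print("rollout std (x):", rollout[:, 0].std(), "vs true:", states[:, 0].std())
```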
ckperry t1_j8xh0sg wrote
Reply to comment by No_Dust_9578 in [N] Google is increasing the price of every Colab Pro tier by 10X! Pro is 95 Euro and Pro+ is 433 Euro per month! Without notifying users! by FreePenalties
Per our terms of service we must give 30 days notice before price changes. This was a mistake and we're fixing ASAP. We'll refund all impacted.
[edit to reflect tax inclusivity] One thing to mention: we recently updated Colab's advertised pricing to be tax-inclusive in the EU, so our advertised prices did increase to reflect taxes; this should not have changed the actual prices paid.