Recent comments in /f/MachineLearning
SnooMarzipans1345 t1_jan3fnu wrote
I'm new to this "whatever" topic --
please explain OP's topic to me as if I were a child, in TL;DR format.
PS: I did read it, but what encryption is this "hero"!?
protonpusher t1_jan2jpe wrote
Reply to comment by HackZisBotez in [D] Are Genetic Algorithms Dead? by TobusFire
No, good on you for being that person. I just looked at the prefix of the URL. Thx!
boglepy t1_jan2f9k wrote
Following!
HackZisBotez t1_jan2bme wrote
Reply to comment by protonpusher in [D] Are Genetic Algorithms Dead? by TobusFire
Sorry to be that person, but that's not Nature, that's Scientific Reports, which is a tier-3 journal in the Nature portfolio. If Nature were compared to a top conference, Scientific Reports would be less than a workshop paper.
-EmpiricalEvidence- t1_jan0mqg wrote
Reply to comment by Dendriform1491 in [D] Are Genetic Algorithms Dead? by TobusFire
Exactly. Due to the computational demands, I don't think genetic algorithms have ever really been "alive", but with compute getting cheaper I could see them finding success similar to the rise of deep learning.
Evolution Strategies (as stabilizers, without the genetic component) are already being deployed quite successfully, e.g. in AlphaStar.
Jeff Clune was quite active in that area of research and he recently joined DeepMind.
sebzim4500 t1_jan01xr wrote
Reply to comment by Timdegreat in [D] OpenAI introduces ChatGPT and Whisper APIs (ChatGPT API is 1/10th the cost of GPT-3 API) by minimaxir
Would you even want to? Sounds like overkill to me, but maybe I am missing some use case of the embeddings.
ShowerVagina t1_jamyp12 wrote
Reply to comment by Stakbrok in [D] OpenAI introduces ChatGPT and Whisper APIs (ChatGPT API is 1/10th the cost of GPT-3 API) by minimaxir
I might be in the minority, but I strongly believe in unfiltered AI (or a minimal filter, only blocking things like directions to cook drugs or make weapons). I know they filter it for liability reasons, but I wish they didn't.
hershey678 t1_jamyfyh wrote
Reply to comment by Hunterhal in [D] Are Genetic Algorithms Dead? by TobusFire
Yeah it's used for automatic hyperparameter tuning.
Better than a grid search or just intuition.
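For intuition, here is a minimal, hedged sketch of GA-style hyperparameter search. `toy_fitness` is only a stand-in for a real validation score, and all names and settings here are illustrative, not from any library:

```python
import random

random.seed(0)

# Toy stand-in for a real validation score; in practice this would
# train a model with the given hyperparameters and return accuracy.
def toy_fitness(params):
    lr, reg = params["lr"], params["reg"]
    return -(lr - 0.01) ** 2 - (reg - 0.001) ** 2

def mutate(params):
    # Multiplicative jitter keeps values positive and spans scales.
    return {k: v * random.uniform(0.5, 2.0) for k, v in params.items()}

def crossover(a, b):
    # Uniform crossover: each hyperparameter comes from either parent.
    return {k: random.choice([a[k], b[k]]) for k in a}

# Initial population of random hyperparameter settings (log-uniform).
pop = [{"lr": 10 ** random.uniform(-4, 0), "reg": 10 ** random.uniform(-5, -1)}
       for _ in range(20)]

for gen in range(30):
    pop.sort(key=toy_fitness, reverse=True)
    parents = pop[:5]  # elitist selection: keep the best settings
    children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                for _ in range(len(pop) - len(parents))]
    pop = parents + children

best = max(pop, key=toy_fitness)
```

With elitism the best fitness never decreases, so even this crude loop homes in on good settings without any gradient.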
nfmcclure t1_jamy7yz wrote
Reply to [D] Are Genetic Algorithms Dead? by TobusFire
I don't think they are dead. Their popularity for NNs is much lower for sure.
In general, GAs can theoretically solve any problem (if you can formulate a fitness function), given long enough time. Because of that, I think they will always have some use cases.
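As a toy illustration of that point (any scoring function works, differentiable or not), here is a minimal GA on the classic OneMax problem; everything here is illustrative:

```python
import random

random.seed(1)

TARGET_LEN = 32

# Fitness: number of 1-bits (the classic OneMax problem). Any scoring
# function could be dropped in here -- that is the point made above.
def fitness(bits):
    return sum(bits)

def mutate(bits, rate=1 / TARGET_LEN):
    # Flip each bit independently with small probability.
    return [b ^ (random.random() < rate) for b in bits]

def crossover(a, b):
    # Single-point crossover.
    cut = random.randrange(1, TARGET_LEN)
    return a[:cut] + b[cut:]

pop = [[random.randint(0, 1) for _ in range(TARGET_LEN)] for _ in range(40)]

for _ in range(100):
    pop.sort(key=fitness, reverse=True)
    elite = pop[:10]
    pop = elite + [mutate(crossover(random.choice(elite), random.choice(elite)))
                   for _ in range(30)]

best = max(pop, key=fitness)
```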
Readityesterday2 t1_jamx6h5 wrote
Reply to [D] Are Genetic Algorithms Dead? by TobusFire
You could say they have gone extinct in the ecosystem of competition from superior approaches 😂
risoo7 t1_jamu5h2 wrote
Reply to [D] Are Genetic Algorithms Dead? by TobusFire
In one of our recent works (https://arxiv.org/abs/2012.14956), we used a genetic algorithm to attack NLP models in a hard-label black-box setting, where we do not have access to the model's confidence scores.
lucidraisin t1_jamtx7b wrote
Reply to comment by fmai in [D] OpenAI introduces ChatGPT and Whisper APIs (ChatGPT API is 1/10th the cost of GPT-3 API) by minimaxir
it cannot, the compute still scales quadratically although the memory bottleneck is now gone. however, i see everyone training at 8k or even 16k within two years, which is more than plenty for previously inaccessible problems. for context lengths at the next order of magnitude (say genomics at million basepairs), we will have to see if linear attention (rwkv) pans out, or if recurrent + memory architectures make a comeback.
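To make the quadratic-compute point concrete, here is a back-of-envelope FLOP count for the attention score matrix (the function and numbers are illustrative, not from any specific model):

```python
def attn_score_flops(seq_len, head_dim):
    """Multiply-adds to form Q @ K^T for one attention head: the score
    matrix is seq_len x seq_len, each entry a length-head_dim dot product."""
    return 2 * seq_len ** 2 * head_dim

# Doubling context from 8k to 16k quadruples the score-matrix compute,
# even when a memory-efficient kernel avoids storing the full matrix.
ratio = attn_score_flops(16384, 64) / attn_score_flops(8192, 64)
```

This is why removing the memory bottleneck alone does not make long contexts cheap.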
ShowerVagina t1_jamts00 wrote
Reply to comment by ---AI--- in [D] OpenAI introduces ChatGPT and Whisper APIs (ChatGPT API is 1/10th the cost of GPT-3 API) by minimaxir
Is that for everyone or just API/Enterprise users?
[deleted] t1_jamt0wc wrote
Reply to comment by LetterRip in [D] OpenAI introduces ChatGPT and Whisper APIs (ChatGPT API is 1/10th the cost of GPT-3 API) by minimaxir
[deleted]
[deleted] t1_jamslnp wrote
Reply to [D] Are Genetic Algorithms Dead? by TobusFire
[removed]
EducationalCicada t1_jamrqo2 wrote
Reply to [D] Podcasts about ML research? by Tight-Vacation-9410
The Gradient Podcast
Gradient Dissent Podcast
Lex Fridman's podcast has also had all the biggest names in AI as guests.
tomd_96 t1_jamp6kt wrote
Reply to comment by LetterRip in [D] OpenAI introduces ChatGPT and Whisper APIs (ChatGPT API is 1/10th the cost of GPT-3 API) by minimaxir
Where was this introduced?
filipposML t1_jamongz wrote
Reply to comment by M_Alani in [D] Are Genetic Algorithms Dead? by TobusFire
We recently published an evolutionary method to sample from the latent space of a variational autoencoder. It is still alive and well. Just a bit niche.
Dekans t1_jamokhr wrote
Reply to comment by fmai in [D] OpenAI introduces ChatGPT and Whisper APIs (ChatGPT API is 1/10th the cost of GPT-3 API) by minimaxir
> We also extend FlashAttention to block-sparse attention, yielding an approximate attention algorithm that is faster than any existing approximate attention method.
...
> FlashAttention and block-sparse FlashAttention enable longer context in Transformers, yielding higher quality models (0.7 better perplexity on GPT-2 and 6.4 points of lift on long-document classification) and entirely new capabilities: the first Transformers to achieve better-than-chance performance on the Path-X challenge (seq. length 16K, 61.4% accuracy) and Path-256 (seq. length 64K, 63.1% accuracy).
In the paper, the bolded results use the block-sparse version. The Path-X result (16K length) uses regular FlashAttention.
---AI--- t1_jamo555 wrote
Reply to comment by ShowerVagina in [D] OpenAI introduces ChatGPT and Whisper APIs (ChatGPT API is 1/10th the cost of GPT-3 API) by minimaxir
OpenAI updated their page to promise they will stop doing that.
IanisVasilev t1_jamldhj wrote
Reply to [D] Are Genetic Algorithms Dead? by TobusFire
It turned out that other models have superior genes.
SpookyTardigrade t1_jaml6d9 wrote
Reply to comment by Hostilis_ in [D] Are Genetic Algorithms Dead? by TobusFire
Can you give a few examples of how genetic algorithms and stochastic gradient estimation are related?
Pfohlol t1_jamirgc wrote
Reply to [D] Podcasts about ML research? by Tight-Vacation-9410
The TWIML AI podcast with Sam Charrington
ahf95 t1_jan46jv wrote
Reply to [D] Are Genetic Algorithms Dead? by TobusFire
They are certainly still used for RL (and other cases where you don't have a gradient), but even in those contexts there have been modern advances that cause the preferred algorithms to diverge from old-school genetic algorithms. For instance, things like Particle Swarm Optimization and the Cross-Entropy Method have their conceptual origins in sampling regimes similar to MCMC approaches, but they've become their own entities at this point, outperforming genetic algorithms and being distinct and broad enough to warrant their own categories.