Recent comments in /f/MachineLearning
Quazar_omega t1_jaxnwrk wrote
Reply to comment by ddproxy in [P] LazyShell - GPT based autocomplete for zsh by rumovoice
Too late
> I regret this immediately.
This statement is false.
- ChatGPT
Thin_Sky t1_jaxmuu6 wrote
Reply to comment by harharveryfunny in [D] OpenAI introduces ChatGPT and Whisper APIs (ChatGPT API is 1/10th the cost of GPT-3 API) by minimaxir
Thanks!
radi-cho OP t1_jaxm1qw wrote
Reply to [P] diffground - A simplistic Android UI to access ControlNet and instruct-pix2pix. by radi-cho
The app: https://play.google.com/store/apps/details?id=com.radicho.diffground
Many more models and an iOS version are coming soon :)
ddproxy t1_jaxkegs wrote
Reply to comment by ddproxy in [P] LazyShell - GPT based autocomplete for zsh by rumovoice
I regret this immediately.
ddproxy t1_jaxju3a wrote
Reply to comment by Quazar_omega in [P] LazyShell - GPT based autocomplete for zsh by rumovoice
Ministry of Truth
chic_luke t1_jaxhi06 wrote
Reply to comment by Quazar_omega in [P] LazyShell - GPT based autocomplete for zsh by rumovoice
"Open"AI
[deleted] t1_jaxc6sf wrote
Reply to comment by ank_itsharma in [P] LazyShell - GPT based autocomplete for zsh by rumovoice
Yeah, I think so.
BeautifulLurker t1_jax84v8 wrote
Could you DM me that .tgz? You see, I'm Satoshi and have been looking for that file for a while.
ank_itsharma t1_jax7x0s wrote
Reply to comment by [deleted] in [P] LazyShell - GPT based autocomplete for zsh by rumovoice
In a similar context, we can fine-tune the OpenAI API on a particular set of data, right?
Quazar_omega t1_jax5sju wrote
Reply to comment by ginger_beer_m in [P] LazyShell - GPT based autocomplete for zsh by rumovoice
I swear, sooner or later they'll change their name to something dystopian like EthicalAI, since they aren't very open anymore but still want to keep up a "good" face
WarAndGeese t1_jax2k9d wrote
Reply to comment by WarAndGeese in [P] LazyShell - GPT based autocomplete for zsh by rumovoice
I think I would prefer that it ends up not being the case, but I can see the trajectory of how it would happen.
WarAndGeese t1_jax2je6 wrote
I imagine that stuff like this will be the future of interacting with computers, at least to a large extent, but it's frustrating how people sacrifice certainty for 'the probability of it being right is good enough'.
rumovoice OP t1_jax22id wrote
Reply to comment by Zieng in [P] LazyShell - GPT based autocomplete for zsh by rumovoice
The availability is meh, around 99%
Zieng t1_jax04t4 wrote
I'll try it out! But how is the API availability? Because the availability of the chatbot, at least on the free tier, is too low :(
maxToTheJ t1_jawzoal wrote
Reply to comment by eunit250 in [P] LazyShell - GPT based autocomplete for zsh by rumovoice
I currently am sadly tethered to living in the present
eunit250 t1_jawyt1l wrote
Reply to comment by maxToTheJ in [P] LazyShell - GPT based autocomplete for zsh by rumovoice
For now.
[deleted] t1_jawynxx wrote
Reply to comment by maxToTheJ in [P] LazyShell - GPT based autocomplete for zsh by rumovoice
[deleted]
n8mo t1_jawxx5t wrote
Reply to comment by hiptobecubic in [P] LazyShell - GPT based autocomplete for zsh by rumovoice
Not me googling the syntax for ffmpeg commands every time I have to use it LMFAO
rpnewc t1_jawxrjh wrote
Yes, ChatGPT does not have any idea what a trophy is, or a suitcase, or what brown is. But it has been trained on a lot of sentences containing these words, and hence some of their attributes. So when you ask these questions, sometimes (by random sampling) it picks the correct noun as the answer; other times it picks the wrong one. Ask it a logic puzzle with ten people as characters and see its reasoning capability.
crayphor t1_jawrahh wrote
Reply to comment by DaTaha in [P] LazyShell - GPT based autocomplete for zsh by rumovoice
You can use a smaller model like GPT-2. You are not going to get ChatGPT performance without a terabyte of VRAM, but if you want to try something locally, GPT-2 exists.
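For reference, a minimal sketch of running GPT-2 locally, assuming the Hugging Face `transformers` package (and a PyTorch backend) is installed; the small gpt2 checkpoint is downloaded on first use:

```python
# Local text generation with the ~124M-parameter GPT-2 checkpoint.
# Greedy decoding (do_sample=False) keeps the output deterministic.
from transformers import pipeline

generator = pipeline("text-generation", model="gpt2")
result = generator("The quick brown fox", max_new_tokens=20, do_sample=False)
print(result[0]["generated_text"])
```

It runs fine on CPU, just don't expect ChatGPT-quality completions from a model this small.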
DaTaha t1_jawprrx wrote
Reply to comment by [deleted] in [P] LazyShell - GPT based autocomplete for zsh by rumovoice
What options does one have if one wants ChatGPT-like functionality without actually reaching out to OpenAI or other such online services?
[deleted] t1_jawnu6c wrote
Reply to comment by svantevid in Did you get access to Meta AI's LLAMA? [Discussion] by WittyBananaPeel
[deleted]
ginger_beer_m t1_jawngph wrote
Reply to comment by [deleted] in [P] LazyShell - GPT based autocomplete for zsh by rumovoice
OpenAI .. LOL
ACH-S t1_jawnajq wrote
Reply to [D] Are Genetic Algorithms Dead? by TobusFire
I'm not sure whether you mean genetic algorithms or evolutionary algorithms, or if those terms are interchangeable for you (often, they are not). Anyway, a field that relies heavily on them is Quality-Diversity (https://quality-diversity.github.io/, there is a nice list of papers there). I would also recommend having a look at the proceedings of the GECCO conference (e.g. https://dl.acm.org/doi/proceedings/10.1145/3512290; the conference is much smaller than NeurIPS/ICML/etc., and the research quality tends to be more variable, but you'll see that evolutionary algorithms, and genetic ones in particular, are far from dead).
The idea that "designing an experiment for a genetic algorithm requires sufficient prior" doesn't sound correct to me; generally you turn to them when you don't have any reliable priors on the search space (as other comments have pointed out, see CMA-ES as an example; I'll add ES, https://arxiv.org/abs/1703.03864, as another useful example that I've personally often used to simplify meta-learning problems).
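To give a feel for the ES idea in that paper (perturb parameters with Gaussian noise, weight each noise vector by the reward it produced, step in the resulting direction), here is a hedged toy sketch, not the paper's implementation (no antithetic sampling, no parallel workers), maximizing a simple quadratic:

```python
import random

def es_optimize(f, theta, sigma=0.1, alpha=0.02, pop=50, iters=300, seed=0):
    """Toy evolution-strategies ascent on reward function f (list -> float)."""
    rng = random.Random(seed)
    n = len(theta)
    for _ in range(iters):
        noises, rewards = [], []
        for _ in range(pop):
            eps = [rng.gauss(0, 1) for _ in range(n)]
            noises.append(eps)
            rewards.append(f([t + sigma * e for t, e in zip(theta, eps)]))
        # Normalize rewards to advantages, then estimate the gradient as a
        # reward-weighted average of the noise vectors.
        mean = sum(rewards) / pop
        std = (sum((r - mean) ** 2 for r in rewards) / pop) ** 0.5 or 1.0
        adv = [(r - mean) / std for r in rewards]
        grad = [sum(a * eps[i] for a, eps in zip(adv, noises)) / (pop * sigma)
                for i in range(n)]
        theta = [t + alpha * g for t, g in zip(theta, grad)]
    return theta

# Reward peaks at (3, -1); start from the origin.
reward = lambda p: -((p[0] - 3) ** 2 + (p[1] + 1) ** 2)
best = es_optimize(reward, [0.0, 0.0])
```

Note it only ever evaluates f, never differentiates it, which is why ES is handy when the objective is a black box (e.g. an inner meta-learning loop).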
rpnewc t1_jaxr6qk wrote
Reply to [P] diffground - A simplistic Android UI to access ControlNet and instruct-pix2pix. by radi-cho
Cool..