Recent comments in /f/MachineLearning
No-Garlic9132 t1_j87mbfm wrote
No_Network_3714 t1_j87lp29 wrote
Reply to comment by logsinh in [D] Are there any AI model that I can use to improve very bad quality sound recording? Removing noise and improving overall quality by CeFurkan
I am also interested in having you process two recordings. They are both a little over 40 minutes long. If you feel you can do this, please contact me at (email address removed). Thanks.
OptimizedGarbage t1_j87jazn wrote
Reply to comment by DoxxThis1 in [D] Have their been any attempts to create a programming language specifically for machine learning? by throwaway957280
So, yes and no. You really do not want the type annotations to be in plain English, because under the Curry-Howard correspondence, types correspond to theorems and code corresponds to proofs of those theorems. It's one thing to know the theorem but not see the proof: you can often trust that a proof is right without seeing it, but you really need to know what the theorem is. If you start with English and generate some type behind the scenes, you never see the actual theorem; you just know the system has proved 'some theorem' about your code. As the programmer you have no idea what that actually tells you, which kind of defeats the point of using static typing in the first place.
That said, you *can* write down a desired type and have a system write down a ton of type annotations or generate a bunch of code to prove that the type you wrote down is satisfied by your program. There's been recent work on this in deep learning for theorem proving, such as work that uses GPT for proving theorems in Lean, a dependently typed programming language and theorem prover. A better approach, though, is to combine this with an actual tree search algorithm to allow a more structured search over the space of proofs, instead of trying to generate full correct proofs in one shot. HyperTree Proof Search does this, using a variant of AlphaZero to search and fine-tune the neural net. Unfortunately it hasn't been open-sourced, and it's pretty compute intensive, so we can't use this for actual type inference yet. But yeah, there's active interest in doing this kind of thing, both as a proving ground for using RL on reasoning tasks and from mathematicians for theorem proving.
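To make the types-as-theorems point concrete, here is a minimal Lean 4 sketch (the theorem name is illustrative; `Nat.add_comm` is from the core library):

```lean
-- The *type* of this declaration is a theorem statement:
-- addition on naturals commutes. The body is the proof, and
-- the type checker accepts the declaration only if the proof is valid.
theorem myAddComm (a b : Nat) : a + b = b + a :=
  Nat.add_comm a b

-- The value to the programmer is in reading the statement itself;
-- being told only that "some theorem" was proved would convey nothing.
```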
[deleted] t1_j87hnsi wrote
Reply to comment by jloverich in [D] Have their been any attempts to create a programming language specifically for machine learning? by throwaway957280
[deleted]
DoxxThis1 t1_j87h9gx wrote
Reply to comment by OptimizedGarbage in [D] Have their been any attempts to create a programming language specifically for machine learning? by throwaway957280
I wonder if GPT could be leveraged to create an NLP-based type system. The programmer annotates the types in plain English, and the AI hallucinates the appropriate theorem-proving axioms! It would be an interesting "dog-fooding" of AI/ML for easier AI/ML development.
EDIT Holy cow what did I say to deserve so many downvotes? The one response below makes me think it's not such a wild idea.
lmericle t1_j87fykv wrote
Vivid-Vibe t1_j87fdst wrote
Reply to [P] Introducing arxivGPT: chrome extension that summarizes arxived research papers using chatGPT by _sshin_
Does this have an API endpoint?
noobgolang t1_j87eve1 wrote
A patent on the transformer is like a patent on physics; it's hilarious.
jloverich t1_j87aynr wrote
Reply to comment by Calm_Motor4162 in [D] Have their been any attempts to create a programming language specifically for machine learning? by throwaway957280
Meta is also working on shumai, which is JavaScript/TypeScript and looks like PyTorch.
A_Light_Spark t1_j8792en wrote
Reply to comment by Trakeen in [P] Introducing arxivGPT: chrome extension that summarizes arxived research papers using chatGPT by _sshin_
They did find some correlations. This type of meta-analysis is not uncommon nowadays, but few papers avoid answering their own question as much as this one does.
[deleted] t1_j877f0b wrote
Reply to comment by Franck_Dernoncourt in [P] Introducing arxivGPT: chrome extension that summarizes arxived research papers using chatGPT by _sshin_
[deleted]
[deleted] t1_j8766l3 wrote
MattRix t1_j872nhe wrote
Reply to comment by SatoshiNotMe in [P] Introducing arxivGPT: chrome extension that summarizes arxived research papers using chatGPT by _sshin_
Some people also figured out that if you pass in the right model id to the regular GPT API, you get ChatGPT (not sure if this has been blocked since it was discovered).
Rhannmah t1_j871jk7 wrote
My opinion is that patents are garbage and should be thrown in the dumpster and set on fire with gasoline.
OptimizedGarbage t1_j86zehe wrote
Reply to [D] Have their been any attempts to create a programming language specifically for machine learning? by throwaway957280
People asking "can't you just do this in X?" are seriously underestimating how difficult a problem this actually is. There's a good reason it's hard: this kind of static type checking requires refinement types (where typechecking is NP-hard). Basically, by including values in the types, typechecking becomes way harder -- the types contain a limited program that involves arithmetic, and sound typing requires you to prove that the dimensions add up correctly. So your type system needs to include a notion of arithmetic, except that because of Gödel's incompleteness theorem, any logical system that includes integer arithmetic is undecidable. So this is basically stepping from traditional static typechecking to something like an automated theorem prover, unless you get very careful and clever with how you set up the problems. That can mean one of two things -- either you write a ton of type annotations (more code than the actual program) to prove to the type system that your program is valid, or you hook up an automated theorem prover to prove the soundness of your program automatically, at the cost of typechecking being NP-hard or worse.
This can be worth it, and there are potentially ways of making this tractable, but it's very non-trivial -- you basically need a type system that's dedicated to this problem specifically, not something that you can stick onto an existing language's type system.
That said, there are some things that try to do this. Haskell has a port of Torch called HaskTorch that includes this kind of typed tensor shape, and it calls the Z3 theorem prover on the backend to solve type inference. It can get away with this because of the LiquidHaskell compiler extension, which has refinement types capable of solving this kind of typing problem and is already pretty mature. Dex is a research language from Google that's based on Haskell and built to explore this kind of typechecking. Really you'd want to do this in Rust, because that's where the tradeoff of speed and safety for convenience makes the most sense, but Rust is just barely on the edge of having a type system capable of this. You have to get really clever with the type system to make it work at all, and there's been no sustained push from any company to develop this into a mature solution. Hopefully something better comes along soon.
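For readers stuck on the Python side: nothing here proves shapes statically, but a tiny runtime sketch (plain Python, hypothetical helper name) shows the kind of arithmetic constraint a refinement-type checker would have to discharge at compile time rather than at run time:

```python
# Sketch: the shape rule for matrix multiplication, checked at runtime.
# A refinement-type system would verify the constraint `k == k2` statically,
# before the program ever runs. Names here are illustrative, not a real API.

def matmul_shape(a_shape, b_shape):
    """Return the result shape of an (m, k) @ (k2, n) product, or raise
    if the inner dimensions disagree. The equation k == k2 is exactly the
    arithmetic fact the type checker must prove about every call site."""
    (m, k), (k2, n) = a_shape, b_shape
    if k != k2:
        raise TypeError(f"inner dimensions differ: {k} != {k2}")
    return (m, n)

print(matmul_shape((3, 4), (4, 5)))  # (3, 5)
```

The hard part is that `k` and `k2` are usually not literals but expressions over other dimensions (reshapes, concatenations, strides), which is why the checker needs real integer arithmetic and not just equality of constants.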
WokeAssBaller t1_j86xz9r wrote
Reply to [D] Have their been any attempts to create a programming language specifically for machine learning? by throwaway957280
Dex is the closest that comes to mind.
With how deep the Python ecosystem is, and how fast LLMs are moving, the next language for ML will likely be English.
__lawless t1_j86xhtr wrote
Reply to comment by wittfm in [D] Have their been any attempts to create a programming language specifically for machine learning? by throwaway957280
That was Swift for TensorFlow. It did not pan out.
farmingvillein t1_j86xbpu wrote
Reply to comment by impossiblefork in [D] Can Google sue OpenAI for using the Transformer in their products? by t0t0t4t4
Additionally, Google has released many open source repositories with transformers and appropriate licensing.
tyranids t1_j86xbn7 wrote
Reply to comment by jloverich in [D] Have their been any attempts to create a programming language specifically for machine learning? by throwaway957280
Imo Fortran is very capable for this, and I'm surprised there isn't more out there than Neural Fortran, really. Nvfortran will even compile your code to offload to Nvidia GPUs.
Franck_Dernoncourt t1_j86wojs wrote
Reply to comment by Sola_Maratha in [P] Introducing arxivGPT: chrome extension that summarizes arxived research papers using chatGPT by _sshin_
Why not impressive?
eigenham t1_j86v8br wrote
Reply to comment by currentscurrents in [D] Can Google sue OpenAI for using the Transformer in their products? by t0t0t4t4
I agree that they should be, but in my experience it's been difficult to get algorithms patented that lawyers felt would actually protect IP. Basically they're not that well tested in court so it's yet to be seen how much protection they'll provide.
Calm_Motor4162 t1_j86v280 wrote
Reply to [D] Have their been any attempts to create a programming language specifically for machine learning? by throwaway957280
JavaScript may work; there is even a complete book on TensorFlow.js.
currentscurrents t1_j86uoft wrote
Reply to comment by eigenham in [D] Can Google sue OpenAI for using the Transformer in their products? by t0t0t4t4
Yeah, Alice Corp. v. CLS Bank significantly limited the scope of software patents. It ruled that adding "on a computer" to an abstract idea does not itself make it patentable.
I believe that true inventions of real algorithms (like movie codecs) are still patentable though.
mil24havoc t1_j86qv1s wrote
Reply to [D] Have their been any attempts to create a programming language specifically for machine learning? by throwaway957280
Well, sort of. There's CUDA and Julia and Scala, right?
muchcharles t1_j87nheg wrote
Reply to [D] Can Google sue OpenAI for using the Transformer in their products? by t0t0t4t4
Schmidhuber prior art