Recent comments in /f/MachineLearning

tysam_and_co t1_jabuusq wrote

Hey, do what works for you, kid. (I say that respectfully). If you're passionate about writing a neural network in Fortran...

...then do it. If you follow one of the practical solutions here and it makes you miserable, then don't do that. There are easier ways than Fortran, but if you accept that it's not necessarily the most efficient or easy way to go about things (like, at all), and this is a passion project for fun, then that's something you can learn with. Whatever fills your sails and gets you _learning things_.

I think you want to ask yourself why Fortran specifically (is it that it's a low-level language, that you like it in particular, the older supercomputing history, etc.?)

Once you have written down all of the _why_ for what you want to do, make sure that you've chosen the right building blocks (the _what_) for it.

I hope this helps you on your journey. Much love and care from this fair enby over here! <3 <3 <3 :DDDD

1

weeeeeewoooooo t1_jabotnq wrote

There is a lot of outdated code in physics and chemistry. It is in Fortran because no one wants to rewrite it.

It is actually more difficult to parallelize Fortran code on supercomputers than with modern distributed computing libraries like HPX, which is a C++ library. Fortran will also perform much worse, as you won't beat the DAG schedulers in HPX.

Fortran is also slower, because you can't write generic high-performance code. This is also true of C. You need something akin to templating, which lets you optimize expressions at compile time as you can in C++. You need generic code for all but the most trivial operations; otherwise it is very difficult and time-consuming to build the more complex operations that compose into neural networks.

Additionally, CUDA is native C++ (and again can be optimized generically, which you can't do once you're behind C or Fortran APIs), so if you want to seriously take advantage of vectorization, you should look at using GPUs. If a place has a supercomputer, it likely also has a GPU cluster.

All these reasons are why C++ is the king of performance programming.

17

-xylon t1_jabmlj2 wrote

I'm not going to jump on the "Fortran is dead" bandwagon, as it's a language that continues to be used in simulation code, and I feel it's never really going to be replaced. As you mentioned, it has good properties for writing that kind of code, good compilers, etc.

That said, it is a very niche language, used mostly in the physics-simulation & supercomputing world. And idk how popular NNs are in that sector, but it seems to me it's a niche inside a niche... So you will need to dig a bit to get answers. Maybe the people at r/Fortran will know more.

7

AvailablePresent1113 t1_jablmv8 wrote

What a joke! I have one reviewer who is clearly championing rejection, and the other two reviewers are complete yes-men. The rejecting reviewer did acknowledge some of my rebuttals, but then made more nonsense excuses just to drag my paper down. The other reviewers just agree with anything he/she says, without taking any clear stance. Seriously, I think there should be rules and restrictions on reviewers labeling anything they conveniently want as "marginal"/"incremental"/...

1

Etterererererer OP t1_jabgx28 wrote

I would say I already have a good understanding of Python and have done many deep learning and machine learning projects in the past using TensorFlow, even making my own crappy framework. I'm more just looking to learn about the bare-bones fundamentals of how to parallelize neural networks by making them from scratch, so I can best understand it all.

0

CKtalon t1_jabg9b5 wrote

Fortran is pretty much a dead language, though. People still use Fortran in their DFT computations only because no one has ported them to a modern language. Since you are picking up a new language anyway, just pick up Python to get the required support, or you will have a lot of trouble finding help.

8

Etterererererer OP t1_jabcxkh wrote

If you're referring to Fortran, there's GPU support via CUDA Fortran; the purpose of using computer clusters and CUDA Fortran and all that stuff seems to be only for ridiculously large data sets. To my knowledge, NASA used such ideas when making simulations for rover landings and other things of that sort.

3

Etterererererer OP t1_jabbuok wrote

Many neural networks used in physics, chemistry, and other fields are often written in Fortran and then trained on supercomputers. It's mainly because Fortran is really easy to parallelize and even has built-in support for linear algebra. Additionally, most supercomputers run on Linux, and to my knowledge Fortran runs really well on Linux. I'm also planning on learning C/C++ down the road for similar applications, but currently I just think Fortran is more fun lol, and again I'm doing it just for the fun of it.

Edit: this is to my knowledge; again, I am EXTREMELY new to Fortran, but at least from my Google research it seems to be used in a range of large-scale projects.

−3

Donno_Nemore t1_jabbex8 wrote

The risks would be a misalignment between the training data and your business data. Business apps with clever acronyms that carry strong sentiment associations would be the major risk.

The value-add of such a system is unclear at best. The IT help desk ticketing systems I am familiar with all have a review/feedback component. The IT staff already know all the A-holes. Are you trying to empirically prove who the A-holes are?

2

royalemate357 t1_jabauxj wrote

> I know it’s common for massive projects to use Fortran in order to train NN

Is it? I'm not aware of any high-profile / large-scale ML projects written in it recently. My understanding is they mostly use Python for the model development, and then C/C++ for the actual math. Fwiw, I think parts of NumPy are written in Fortran, though.

27

maximalentropy t1_jaaigzj wrote

I went from 443 to 422. The reviewers mentioned new concerns that they hadn't previously expressed and focused on one writing nit that could be fixed in a sentence or less. Is it worth emailing the AC? I feel the reviewers were biased, or tried to drag down the scores for no good reason.

1