Recent comments in /f/MachineLearning

AdFew4357 t1_j5pqkqb wrote

I have one minor gripe about deep learning textbooks. I think they are great references, but they should not be used as a beginner's way into the field. I genuinely feel a student's time is better spent going down a rabbit hole of actual papers on one of the book's chapters: say, the student reads the chapter on graph neural networks and then proceeds to read everything on graph neural networks, rather than reading the whole book across its different subsections.

1

Cyclone4096 t1_j5owtmi wrote

I don’t have much background in ML. I want to build a fairly small neural network that takes a single input from time-series data and produces a single output for each data point. My loss function aggregates the entire time-series output into a single scalar value. I’m using PyTorch, and when I call “.backward()” on the loss it takes a long time (understandably). Is there an easier way to do this than running the backward gradient calculation on a loss that is itself the result of hundreds of millions of values? Note that the network itself is tiny, maybe fewer than 100 weights, but I don’t have any golden target; I want to minimize a complex function calculated from the entire time-series output.
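One approach worth sketching for the setup described above: if the scalar loss decomposes as a sum of per-step (or per-chunk) terms, you can call “.backward()” once per chunk instead of once on the whole series. Gradients accumulate in each parameter's `.grad`, and each chunk's autograd graph is freed immediately after its backward pass, which keeps memory and backward time bounded. This is a minimal sketch with hypothetical names and sizes, and it only applies when the loss is decomposable in this way; a loss that is a non-separable function of all outputs would need a different strategy.

```python
import torch

torch.manual_seed(0)

# Tiny stand-in for the ~100-weight network described in the comment.
model = torch.nn.Sequential(
    torch.nn.Linear(1, 8),
    torch.nn.Tanh(),
    torch.nn.Linear(8, 1),
)
opt = torch.optim.SGD(model.parameters(), lr=1e-3)

# Long time series: one scalar input per step (length here is illustrative).
x = torch.randn(200_000, 1)

opt.zero_grad()
for chunk in x.split(10_000):       # process 10k steps at a time
    y = model(chunk)                # one output per step in the chunk
    loss_piece = (y ** 2).sum()     # hypothetical per-chunk term of the total loss
    loss_piece.backward()           # accumulates into .grad; frees this chunk's graph
opt.step()
```

Because backward runs per chunk, autograd never has to hold the graph for the full series at once; the total gradient is identical to calling backward on the summed loss, just computed piecewise.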

1

terranop t1_j5of10a wrote

Is there a reason why trans people (specifically trans men, since trans people of other genders would qualify under the "gender" criterion) are not included in the URM criteria? It seems kinda odd to include all these other minority groups but not trans people. Are transgender people not actually underrepresented at ICLR?

4

SimonJDPrince OP t1_j5ocrdo wrote

I'd say that mine is more internally consistent -- all the notation is consistent across all equations and figures. I have made 275 new figures, whereas he has curated existing figures from papers. Mine is more in depth on the topics that it covers (only deep learning), but his has much greater breadth. His is more of a reference work, whereas mine is intended mainly for people learning this for the first time.
Full credit to Kevin Murphy -- writing a book is much more work than people think, and so completing that monster is quite an achievement.

Thanks for the tip about Hacker News -- that's a good idea.

3