
I found this paper that helps answer the question: https://arxiv.org/pdf/1803.10228.pdf


When possible, one should link to the abstract page, if for no other reason than that it makes version tracking easier: Wang, Wu, Essertel, Decker, and Rompf - Demystifying differentiable programming: shift/reset the penultimate backpropagator (https://arxiv.org/abs/1803.10228), a title which is entirely too cute for its own good.


Thanks Jade, didn’t realize I grabbed the wrong link.

This paragraph from the paper was helpful for me:

Differentiable programming is of joint interest to the machine learning and programming language communities. As deep learning models become more and more sophisticated, researchers have noticed that building blocks into a large neural network model is similar to using functions, and that some powerful neural network patterns are analogous to higher-order functions in functional programming [Fong et al. 2017; Olah 2015]. This is also thanks to the development of modern deep learning frameworks which make defining neural networks “very much like a regular program” [Abadi et al. 2017; LeCun 2018].
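The paper's core trick is writing reverse-mode AD in continuation-passing style: each operator computes its result, hands it to the continuation (the rest of the program), and accumulates gradients after the continuation returns. Here's a rough sketch of that idea in plain Python rather than the paper's Scala shift/reset; the class name NumR is borrowed from the paper, everything else is my own illustrative approximation, not their actual implementation:

```python
class NumR:
    """A number carrying a gradient slot (loosely, the paper's NumR)."""
    def __init__(self, x, d=0.0):
        self.x = x  # value
        self.d = d  # accumulated gradient

    def __add__(self, other):
        def op(k):
            y = NumR(self.x + other.x)
            k(y)             # run the rest of the program (forward pass)
            self.d += y.d    # then propagate gradients (backward pass)
            other.d += y.d
        return op

    def __mul__(self, other):
        def op(k):
            y = NumR(self.x * other.x)
            k(y)
            self.d += other.x * y.d
            other.d += self.x * y.d
        return op


def grad(f, x):
    """Differentiate f at x by seeding the output gradient with 1.0."""
    inp = NumR(x)
    f(inp)(lambda out: setattr(out, "d", 1.0))
    return inp.d


print(grad(lambda a: a * a, 3.0))  # d(x^2)/dx at x=3 -> 6.0
print(grad(lambda a: a + a, 5.0))  # d(2x)/dx -> 2.0
```

The point of the paper's shift/reset operators is that you get to write the function in ordinary direct style and the compiler threads these continuations for you, instead of the explicit CPS plumbing shown here.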


They have some interesting software: https://feiwang3311.github.io/Lantern/



