
Differential programming is a generalization of deep learning, so all the same practical examples from deep learning apply. The difference, however, is that in DP you can write your "models" simply as functions. This means that in an ideal DP framework, the code you write needs no special types and no special syntax. This would be particularly useful if you're using some other library's code which was not written with DP/DL in mind at all (think ODE solvers, optimization routines, etc.); that code can be added to your "model" (again, just a regular function), and it will just work! A minimal sketch of the idea is below.
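A minimal sketch of what this looks like, with illustrative (hypothetical) function and parameter names; the point is simply that a "model" and its loss are ordinary functions, with no framework-specific layer types or graph-building API:

    # A "model" in a DP setting is just a plain Julia function of its parameters.
    model(w, b, x) = w * x + b

    # The loss is also ordinary code; any library call could appear inside it.
    loss(w, b, x, y) = (model(w, b, x) - y)^2

    loss(2.0, 1.0, 3.0, 7.0)   # == 0.0, just a regular function call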

The difficulties of course lie in implementing such a framework, verifying whether your program is truly differentiable, and so on. The author of this blog post has written a working prototype framework in Julia called Zygote.jl. In short, it achieves DP via source-to-source transformation of Julia code at compile time, and it has already been quite successful.
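As a rough sketch of the user-facing side (Zygote's `gradient` function is real; the example functions here are just illustrative), you ask for the gradient of an ordinary function and Zygote derives the adjoint code for you, including through plain loops and library calls:

    using Zygote   # ] add Zygote

    f(x) = 3x^2 + 2x + 1
    gradient(f, 3.0)                    # (20.0,)

    # Works through ordinary control flow, no special syntax required.
    function mysum(xs)
        s = 0.0
        for x in xs
            s += sin(x)
        end
        return s
    end

    gradient(mysum, [0.0, 1.0, 2.0])    # ([cos(0.0), cos(1.0), cos(2.0)],)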



