Knet: beginning deep learning with 100 lines of Julia
Yuret, D. (2016). Knet: beginning deep learning with 100 lines of Julia. In: NIPS 2016: Proceedings of the 30th International Conference on Neural Information Processing Systems, Barcelona, Spain, December 5-10, 2016.
Abstract
Knet (pronounced "kay-net") is the Koç University machine learning framework implemented in Julia, a high-level, high-performance, dynamic programming language. Unlike gradient-generating compilers like Theano and TensorFlow, which restrict users to a modeling mini-language, Knet allows models to be defined by just describing their forward computation in plain Julia, allowing the use of loops, conditionals, recursion, closures, tuples, dictionaries, array indexing, concatenation and other high-level language features. High performance is achieved by combining automatic differentiation of most of Julia with efficient GPU kernels and memory management. Several examples and benchmarks are provided to demonstrate that GPU support and automatic differentiation of a high-level language are sufficient for concise definition and efficient training of sophisticated models.
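The "define the model as plain Julia forward code" style described in the abstract can be sketched as follows. This is a minimal linear-regression example in the spirit of Knet's documented `grad`/`loss` pattern; the synthetic data and learning rate are illustrative assumptions, not from the paper:

```julia
using Knet   # exports grad() for automatic differentiation

# Forward computation written in plain Julia: a scalar linear model
predict(w, x) = w[1] .* x .+ w[2]

# Mean squared error; any Julia code (loops, conditionals, etc.) could appear here
loss(w, x, y) = sum(abs2, predict(w, x) .- y) / length(y)

# grad returns a function that computes the gradient of loss w.r.t. its first argument
lossgradient = grad(loss)

# Illustrative synthetic data: y = 3x + 1 plus noise
x = randn(100)
y = 3 .* x .+ 1 .+ 0.1 .* randn(100)

# A few steps of plain SGD over the parameter list
w = Any[0.0, 0.0]
for epoch in 1:100
    g = lossgradient(w, x, y)
    for i in 1:length(w)
        w[i] -= 0.1 * g[i]   # assumed learning rate of 0.1
    end
end
```

After training, `w` should approach `[3.0, 1.0]`; the point of the sketch is that both the model and the training loop are ordinary Julia code, with `grad` supplying the derivatives.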