Very Deep (Infinite) Neural Networks - [Repo]



Benjamin Meier of Zurich University has written a report on infinitely deep neural networks, which describes a meta-layer for neural networks:

"This meta-layer contains an infinite amount of sub-layers, where the network can decide at the training time how many of these layers should be used." [source]

The neural network can be optimized using gradient descent methods. The code and the library from this paper are open-sourced on GitHub:

"The repository contains a small library that allows it to use the described meta-layer. The library is based on Keras. The library is very minimal, so not all network architectures may be created with it." [source]

The repository also shows how to create a basic model of the network, along with a few experiments that give a better understanding of how it behaves. One experiment receives 8 binary inputs and computes their XOR, while the other uses the well-known MNIST dataset; a sketch of the first experiment follows below.
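As a rough illustration of that first experiment, here is a hedged sketch of the 8-bit XOR (parity) task, reusing the hypothetical MetaLayer from the earlier sketch. The data generation follows the task description; the dataset size, architecture, and training settings are guesses, not the author's actual experiment code.

```python
import numpy as np
import tensorflow as tf

# Random 8-bit vectors, labeled with the XOR (parity) of all bits.
rng = np.random.default_rng(0)
X = rng.integers(0, 2, size=(4096, 8)).astype("float32")
y = (X.sum(axis=1) % 2).astype("float32")

model = tf.keras.Sequential([
    MetaLayer(units=32, max_depth=8),   # hypothetical layer from the sketch
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy",
              metrics=["accuracy"])
model.fit(X, y, epochs=50, batch_size=64, verbose=0)

# Inspect which depths the optimizer ended up favoring.
print("learned depth distribution:",
      tf.nn.softmax(model.layers[0].depth_logits).numpy())
```

Parity is a natural test for learned depth, since a deeper composition of simple layers tends to represent it more easily than a shallow one.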

While reading through the report and thinking about practical applications of this type of architecture, I gathered that the author is unlikely to keep developing it due to time constraints, though he may experiment with similar models.

The author is, however, hopeful that someone else might pick up the work and its ideas and continue developing them. You can find the library, the report, and the experiments at the following repo:

Very Deep (Infinite) Neural Networks - [Repo]


To stay in touch with me, follow @cristi


Cristi Vlad, Self-Experimenter and Author
