I plan to work on this project daily, spending at least 35 hours a week. I may take part in a summer school on Machine Learning here in Spain in June, but that would only last five days and I can bring my laptop with me. This is the schedule I would like to follow:
Before 30 May:
I want to become more familiar with the community and the mentors. I would like to contribute as much as I can to the nnet Octave Forge package, and I plan to check whether code from that package can be reused, since it is related to neural networks. In addition, I will study in depth how the Matlab CNN toolbox works, so that before coding starts our package can have a modular, configurable design that is compatible with it. It is critical to find out whether Pytave offers enough functionality to use Tensorflow (or even Keras) through its Python interface.
Phase 1, until 30 June:
Write the basic layers (image input layer, convolutional layer, ReLU layer, max/average pooling layer, fully connected layer and dropout layer), test their functionality and check their compatibility with Matlab code.
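To make the compatibility goal concrete, here is a minimal sketch of the forward pass of a non-overlapping 2x2 max pooling layer, written in plain Python. The function name and the plain-list representation are illustrative assumptions; in the package itself this computation would be delegated to Tensorflow, and this reference behaviour is what the Matlab compatibility checks would compare against.

```python
def maxpool2x2(image):
    """Non-overlapping 2x2 max pooling over a 2-D list of lists (sketch)."""
    rows, cols = len(image), len(image[0])
    pooled = []
    for r in range(0, rows - 1, 2):
        row = []
        for c in range(0, cols - 1, 2):
            # Take the maximum of each 2x2 window.
            row.append(max(image[r][c], image[r][c + 1],
                           image[r + 1][c], image[r + 1][c + 1]))
        pooled.append(row)
    return pooled

print(maxpool2x2([[1, 3, 2, 0],
                  [4, 2, 1, 1],
                  [0, 5, 6, 2],
                  [7, 1, 3, 4]]))  # [[4, 2], [7, 6]]
```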
Phase 2, until 28 July:
Write the remaining layers, such as the classification, softmax and regression layers. Write the main CNN class, which holds the layers, their training process and the classify method. At this point we should be able to train some networks and test their functionality. Once again, I will check compatibility with the Matlab toolbox.
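As a design sketch only, the main class described above could hold an ordered list of layers and chain them in its classify method, roughly like this plain-Python mock-up. The class name `SeriesNetwork` and the toy layers are hypothetical; the real implementation would dispatch each layer to Tensorflow rather than call Python lambdas.

```python
class SeriesNetwork:
    """Hypothetical container for an ordered sequence of layers."""

    def __init__(self, layers):
        self.layers = layers  # ordered list of callable layer objects

    def forward(self, x):
        # Each layer transforms the output of the previous one.
        for layer in self.layers:
            x = layer(x)
        return x

    def classify(self, x):
        # Return the index of the highest-scoring class.
        scores = self.forward(x)
        return scores.index(max(scores))

# Toy stand-in layers: a fixed linear map as a "fully connected" layer,
# followed by a ReLU.
fc = lambda v: [v[0] - v[1], v[0] + v[1]]
relu = lambda v: [max(0.0, s) for s in v]

net = SeriesNetwork([fc, relu])
print(net.classify([2.0, 1.0]))  # 1, since the scores are [1.0, 3.0]
```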
Phase 3, until 29 August:
Add extra features such as the activations class, which can display a layer's activations, and the possibility to load pretrained networks. We could start by defining the AlexNet and VGG16 networks, but this could be extended with an exporting/importing tool to save and load CNN architectures and weights to files. Add documented use cases to create a tutorial.
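The export/import tool mentioned above could be sketched as follows, serializing the architecture and weights with JSON through the standard library. The on-disk format, the field names and the function names are assumptions for illustration, not the package's final design.

```python
import json

def export_network(layers, path):
    """Write a list of layer descriptions ({"type": ..., "weights": ...})."""
    with open(path, "w") as f:
        json.dump({"layers": layers}, f)

def import_network(path):
    """Read the layer descriptions back from disk."""
    with open(path) as f:
        return json.load(f)["layers"]

layers = [{"type": "conv", "weights": [[0.1, -0.2], [0.3, 0.4]]},
          {"type": "relu", "weights": None}]
export_network(layers, "net.json")
assert import_network("net.json") == layers
```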
As for testing the layers, I plan to write black-box unit tests before their implementation. That is, writing small pieces of code that check that, given an input to a layer, the expected output is returned. This is useful to check two things: that the implementation is correct (we are calling the Tensorflow API properly) and that it is compatible with Matlab's CNN toolbox, because we can make them exhibit the same behaviour.
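An illustrative black-box test of this kind, written before the layer exists, simply fixes an input/expected-output pair that any implementation (Tensorflow-backed or Matlab's) must reproduce. The `relu` function below is a plain-Python stand-in for the layer under test.

```python
def relu(values):
    # Stand-in for the layer under test: element-wise max(0, x).
    return [max(0.0, v) for v in values]

def test_relu_layer():
    # Black-box check: fixed input, fixed expected output.
    layer_input = [-1.5, 0.0, 2.0]
    expected = [0.0, 0.0, 2.0]
    assert relu(layer_input) == expected

test_relu_layer()
print("relu layer test passed")
```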
Since we will be running Tensorflow code under the surface, we could go a step further and use the Tensorboard utility to inspect training metrics, or even verify with the graph view that the displayed network corresponds to the architecture defined in the m-code. This utility could be enabled for the Octave package.
Throughout the whole process I plan to keep the code well documented using Doxygen. If we manage to use Tensorflow's Python interface, we could look into another documentation tool (such as Sphinx).
[[Category: Summer of Code]]