User:Enricobertino: Difference between revisions

1,513 bytes added, 27 March 2017
 
* Please provide a rough estimated timeline for your work on the task.
*: '''05/05 - 30/05''' (community bonding period)
*:: Week1
*::: - Get in contact and familiarize with the community, using both the mailing list and the IRC channel.
*::: - Improve expertise with tools like Mercurial and autotools
*::: - Read the Matlab documentation of the Neural Network Toolbox classes and basic functions, with a focus on the deep-learning part about CNNs
*:: Week2
*::: - Deeply analyze both Python and C++ Tensorflow APIs in order to figure out the best path to follow
*::: - Install and run Pytave. Read the documentation if provided, or exchange with the maintainers
*:: Week3
*::: - Test Pytave and figure out whether there are bugs or missing features in the specific parts that we need.
*::: - Fix potential bugs or submit new patches
*::: - Figure out whether we need some object-oriented programming in Octave (like classdef) and test it
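As a concrete illustration of the object-passing tests planned for Pytave, a small Python helper module like the following could be called from an Octave script. This is only a sketch: the function names are ours, not part of Pytave, and the point is simply to check that basic values survive the Octave-to-Python round trip.

```python
# Hypothetical helper module (names are ours, not part of Pytave) that
# an Octave script could call through Pytave to verify type conversion.

def roundtrip(value):
    """Return the value unchanged, so the Octave side can compare it."""
    return value

def py_typename(value):
    """Report the Python-side type name, to inspect how Octave data arrives."""
    return type(value).__name__
```

On the Octave side the same module would be invoked through Pytave's Python-call mechanism, and the returned values compared against the originals.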
*: '''30/05 - 30/06''' (Phase 1)
*:: Week1,2 - Work on the makefile in order to link TF in either Python or C++ and test some basic nets with TF
*:: Week3,4 - Write all the Octave classes for every layer, using the corresponding TF functions. Because of the focus on the Matlab Nnet Toolbox, we will start by defining the fundamental layers used for CNNs: convolutional layer, ReLU layer, normalization layer, average pooling layer, max pooling layer, fully connected layer, dropout layer, softmax layer, classification output layer, regression output layer
*:: Week5 - Implement a draft of the training functions (seriesNetwork object, trainNetwork, trainingOptions) without all options and parameters
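To make the Phase 1 layer work concrete, the plan is to represent each layer as a lightweight object holding only its hyperparameters, which later maps onto a TensorFlow op. A minimal sketch (class and field names are assumptions modeled on the Matlab layer names listed above, not a fixed design) could look like:

```python
# Sketch of layer objects mirroring the Nnet Toolbox layers; all names
# here are illustrative assumptions, not the final Octave interface.

class Layer:
    """Base class: every layer stores a name plus its hyperparameters."""
    def __init__(self, name):
        self.name = name

class Convolution2DLayer(Layer):
    def __init__(self, filter_size, num_filters, name="conv"):
        super().__init__(name)
        self.filter_size = filter_size
        self.num_filters = num_filters

class ReLULayer(Layer):
    def __init__(self, name="relu"):
        super().__init__(name)

class MaxPooling2DLayer(Layer):
    def __init__(self, pool_size, name="maxpool"):
        super().__init__(name)
        self.pool_size = pool_size

class FullyConnectedLayer(Layer):
    def __init__(self, output_size, name="fc"):
        super().__init__(name)
        self.output_size = output_size

# A network is simply an ordered list of layers, as in seriesNetwork.
layers = [Convolution2DLayer(5, 20),
          ReLULayer(),
          MaxPooling2DLayer(2),
          FullyConnectedLayer(10)]
```

Keeping layers as plain data objects means the TF-specific translation can live in one place instead of being spread across every class.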
*: '''01/07 - 24/07''' (Phase 2)
*:: Week1-2 - Finish the implementation and testing of the training functions.
*:: Week3 - Enable parallelization and analyze CUDA integration
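One simple way to prototype the trainingOptions part of the training functions is an options object with sensible defaults that callers can override. The sketch below is a hedged assumption (a plain dict with made-up defaults), not the real option set, which still has to be matched against the Matlab function:

```python
# Hypothetical stand-in for trainingOptions: a dict of defaults that
# keyword arguments can override. Option names and defaults here are
# illustrative guesses, not the Matlab-compatible set.
def training_options(solver="sgdm", **kwargs):
    opts = {"solver": solver,
            "max_epochs": 30,
            "mini_batch_size": 128,
            "initial_learn_rate": 0.01}
    opts.update(kwargs)
    return opts
```

A dict-based draft makes it cheap to add or rename options while the Matlab-compatibility work in this phase pins down the final names.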
*: '''28/07 - 21/08''' (Final phase)
*:: Week1-2 - Implementation of more advanced nets (AlexNet, vgg16, vgg19) and applications like deepDreamImage
*:: Week3 - Try to implement ImageDatastore in order to manage image import seamlessly
*: All tests and documentation will be written throughout the whole period, simultaneously with every function. Both BIST tests and doc tests will be written, along with Python unit tests.
*: All the steps will become clearer after some discussions with the mentors.
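The ImageDatastore idea from the final phase could be prototyped as an object that only tracks file paths and labels and yields them lazily, which is the core behavior of Matlab's imageDatastore that matters for seamless image import. A minimal sketch (all names and the interface are assumptions, and image decoding is deliberately left out):

```python
# Hypothetical minimal ImageDatastore sketch: lazy iteration over
# (path, label) pairs. A real implementation would decode each image
# (e.g. with an imread-style call) instead of yielding the path.
class ImageDatastore:
    def __init__(self, files, labels=None):
        self.files = list(files)
        self.labels = (list(labels) if labels is not None
                       else [None] * len(self.files))

    def __len__(self):
        return len(self.files)

    def __iter__(self):
        for path, label in zip(self.files, self.labels):
            yield path, label
```

Deferring the actual file reads to iteration time is what lets such a datastore handle image sets that do not fit in memory.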


[[Category: Summer of Code]]
