ResNets using gradient-boosted blocks of 2-layer neural networks
Investigating the significance of the change in representation across subsequent layers of a deep network, and analyzing the effect of joint versus individual (stagewise) training of these layers.
Project status: Under Development
Intel Technologies
Intel Opt ML/DL Framework, Movidius NCS
Overview / Usage
An alternative way to construct ResNets, replacing each residual block with a gradient-boosted weak learner.
Methodology / Approach
According to existing work, the key difference between ResNets and gradient boosting methods is that gradient boosting directly updates the predictor at each stage, whereas ResNets iteratively refine the feature representation by stacking residual layers rather than updating the predictor itself. We are studying how to construct ResNets by replacing the residual blocks with gradient-boosted weak models (2-layer neural networks).
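A minimal sketch of what this construction might look like in Keras: each residual block is a 2-layer weak learner, and the blocks can be trained stagewise (boosting-style, freezing earlier blocks) or jointly end-to-end for comparison. The helper names (`make_block`, `build_resnet`), the toy regression data, and the training schedule are illustrative assumptions, not the project's actual code.

```python
# Sketch: ResNet built from 2-layer blocks, trained stagewise vs. jointly.
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

def make_block(width, name):
    """A 2-layer weak learner: Dense -> ReLU -> Dense (same width as input)."""
    return keras.Sequential(
        [layers.Dense(width, activation="relu"),
         layers.Dense(width)],
        name=name,
    )

def build_resnet(x, blocks):
    """Residual composition: h_{t+1} = h_t + f_t(h_t)."""
    h = x
    for block in blocks:
        h = h + block(h)
    return h

# Toy regression data (stand-in for the real task).
rng = np.random.default_rng(0)
X = rng.normal(size=(1024, 8)).astype("float32")
y = np.sin(X).sum(axis=1, keepdims=True).astype("float32")

width, n_blocks = 8, 4
blocks = [make_block(width, f"block_{t}") for t in range(n_blocks)]
head = layers.Dense(1, name="predictor")  # shared linear predictor
inputs = keras.Input(shape=(width,))

# Stagewise ("boosted") training: add one block at a time, freeze earlier ones.
for t in range(1, n_blocks + 1):
    for b in blocks[: t - 1]:
        b.trainable = False  # earlier weak learners stay fixed
    model = keras.Model(inputs, head(build_resnet(inputs, blocks[:t])))
    model.compile(optimizer="adam", loss="mse")
    model.fit(X, y, epochs=5, batch_size=64, verbose=0)

# Joint (end-to-end) training of all blocks together, for comparison.
for b in blocks:
    b.trainable = True
joint = keras.Model(inputs, head(build_resnet(inputs, blocks)))
joint.compile(optimizer="adam", loss="mse")
joint.fit(X, y, epochs=20, batch_size=64, verbose=0)
```

Comparing the representations and the loss reached by the stagewise and joint runs is one way to probe how much each added layer changes the learned features.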
Technologies Used
Keras, TensorFlow, Intel Movidius Neural Compute Stick