Congratulations, Cole! Even though we intended to fund projects in Q1 2016, your proposal was so awesome that we just had to get you started right away.

Check out Cole's proposal to do hyperparameter optimization and visualization on dreaming neural nets. And he's already refactored his code for "public consumption." 

Our first coworking session at Floyd's on Sunday was a big success: we came up with a better regularization approach that cut his error rate on the Kaggle MNIST problem by more than a third (accuracy jumped from 92% to 95% with just a partial implementation of the new approach). Basically, he realized that 25% random dropout in combination with L2 weight regularization was driving all his weights (and performance) to zero. First he turned off random dropout (temporarily, until he gets regularization dialed in). Then he switched from the L2 norm to L1 (as a first step towards a general p-norm). I can't wait to see what p-norm does!
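To make the progression from L2 to L1 to a general p-norm concrete, here's a minimal sketch of the penalties in plain NumPy. The function names and the lambda value are illustrative, not from Cole's actual code:

```python
import numpy as np

def pnorm_penalty(weights, lam=1e-4, p=2.0):
    """Generalized p-norm weight penalty: lam * sum(|w|**p).

    p=2.0 is the L2 penalty Cole started with; p=1.0 is the L1
    penalty he switched to. (Illustrative sketch, not Cole's code.)
    """
    return lam * np.sum(np.abs(weights) ** p)

def pnorm_grad(weights, lam=1e-4, p=2.0):
    """Gradient of the penalty with respect to the weights.

    For p=2 the shrinkage is proportional to each weight, so stacked
    on top of 25% dropout it can drag everything toward zero; for p=1
    the shrinkage is a flat lam * sign(w), which spares the larger
    weights that carry the signal.
    """
    return lam * p * np.sign(weights) * np.abs(weights) ** (p - 1.0)

# Same weights, penalized two ways.
w = np.array([-0.5, 0.01, 2.0])
print(pnorm_grad(w, p=2.0))  # L2: gradient scales with each weight
print(pnorm_grad(w, p=1.0))  # L1: gradient is just +/- lam
```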

And this week he's looking into d3 visualizations of the weights (see the export sketch after this list):

  1. heatmap matrices
  2. force-directed graphs
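
To feed those d3 views, here's a minimal sketch of what a weight export might look like. Everything here (function name, file name, threshold, JSON schema) is an illustrative guess, not Cole's actual format:

```python
import json
import numpy as np

def export_weights_for_d3(weights, out_path="weights.json", threshold=0.05):
    """Dump one layer's weight matrix in shapes d3 can consume.

    "matrix" feeds a heatmap; "nodes"/"links" feed a force-directed
    layout, with near-zero edges dropped so the graph stays readable.
    """
    n_in, n_out = weights.shape
    nodes = ([{"id": "in_%d" % i} for i in range(n_in)] +
             [{"id": "out_%d" % j} for j in range(n_out)])
    links = [{"source": "in_%d" % i, "target": "out_%d" % j, "weight": float(w)}
             for (i, j), w in np.ndenumerate(weights)
             if abs(w) >= threshold]
    with open(out_path, "w") as f:
        json.dump({"matrix": weights.tolist(), "nodes": nodes, "links": links}, f)

# Example: export a random 4x3 "layer" just to exercise the format.
export_weights_for_d3(np.random.randn(4, 3))
```

A heatmap can bind to `matrix` cell by cell, and a force-directed layout can consume `nodes`/`links` (the exact link format depends on the d3 version).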
And the real kicker…


I can't wait to see what you come up with next.

Thank you, Thunder, for helping him get started on this awesome project.
