It looks like Total Good has some competition in the quest to add to the greater good. Elon, Sam, and friends have funded OpenAI.org with a vision very similar to ours. I really hope they live up to their name and their founders' reputations by open-sourcing much of their work. If anyone finds any of their projects on GitHub or somewhere else "traditionally open source," let me know. From my search, and the e-mail correspondence below, it seems they are taking it slow.
Viewing posts by Hobson Lane
Congratulations Cole! Even though we intended to fund projects in Q1 2016, your proposal was so awesome that we just had to get you started right away.
Thunder has been modeling time series from his Android phone to do supervised learning. I just found an excellent thesis on a Bag-of-Features approach to time series feature generation/extraction that might help him out a lot: 2012 Baydogan, ASU. And that got me thinking about NLP using word2vec sequences as an ordered sequence instead of a bag of words/features. It's high time someone brought the subtleties of word order (in addition to meaning) into NLP. I bet conventional time series ML (especially feature extraction) would work great on sequences of 10D or 100D word vectors.
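To make the idea concrete, here's a minimal sketch of what I mean: treat a sentence as a multivariate time series (one "channel" per embedding dimension) and extract simple order-sensitive features per dimension. The random vectors stand in for real word2vec embeddings, and the feature choices here are just illustrative, not from Baydogan's thesis.

```python
import numpy as np

# Hypothetical stand-in for a sentence of 7 words embedded as 10-D word vectors
# (rows = words in order, columns = embedding dimensions).
rng = np.random.default_rng(0)
sentence = rng.normal(size=(7, 10))

def timeseries_features(seq):
    """Per-dimension time-series features that preserve word order,
    unlike a bag of words."""
    diffs = np.diff(seq, axis=0)        # word-to-word change in meaning
    return np.concatenate([
        seq.mean(axis=0),               # average meaning (order-insensitive)
        seq.std(axis=0),                # spread of meaning
        diffs.mean(axis=0),             # net drift across the sentence
        np.abs(diffs).mean(axis=0),     # how abruptly meaning changes
    ])

features = timeseries_features(sentence)
print(features.shape)  # (40,) -- 4 features per embedding dimension
```

The resulting fixed-length vector could then feed any conventional ML model, while still carrying some word-order signal that a plain bag of words throws away.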
We're happy to announce the winner of this year's Total Good Intelligent Machines grant! Thunder was awarded $1000 this week to support a summer of data science work for Hack Oregon. He plans to mine campaign finance data and contribute to Hack Oregon's open source code base. And he'll contribute to Total Good's open-source [machine intelligence utilities](http://github.com/totalgood/pug-ann) and hyperparameter optimization technology. He's already begun contributing by getting the pug-ann documentation in order and getting up to speed on the various Hack Oregon projects.
Zeke, Thunder, Joe and I are comparing the performance of several Hyperparameter Optimization algorithms (i.e. autotuning) for a presentation at the Wolfram Data Summit in September.
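For a flavor of what such a comparison looks like, here's a toy sketch pitting grid search against random search on a made-up objective with a known optimum. The objective, budgets, and parameter names are all hypothetical; the actual algorithms, models, and metrics in our benchmark aren't described here.

```python
import random

def objective(lr, momentum):
    """Toy 'validation loss' with its optimum at lr=0.1, momentum=0.9."""
    return (lr - 0.1) ** 2 + (momentum - 0.9) ** 2

def grid_search(n_per_axis=4):
    """Evaluate an evenly spaced grid over [0, 1] x [0, 1]."""
    pts = [i / (n_per_axis - 1) for i in range(n_per_axis)]
    return min(objective(lr, m) for lr in pts for m in pts)

def random_search(n_trials=16, seed=0):
    """Evaluate uniformly random points with the same budget."""
    rng = random.Random(seed)
    return min(objective(rng.random(), rng.random()) for _ in range(n_trials))

print("grid best loss:  ", grid_search())     # 16 evaluations
print("random best loss:", random_search())   # 16 evaluations
```

Even a toy like this shows why the comparison matters: with the same evaluation budget, different search strategies can land very different distances from the optimum, and the gap grows with the number of hyperparameters.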