Blog

OpenAI

It looks like Total Good has some competition in the quest to contribute to the greater good. Elon, Sam, and friends have funded OpenAI.org with a vision very similar to ours. I really hope they live up to their name and the reputation of their founders and open-source much of their work. If anyone finds any of their projects on GitHub or somewhere else "traditionally open source," let me know. From my search, and the e-mail correspondence below, it seems they are taking it slow.

Continue reading

Warped Time Series

Thunder has been modeling time series from his Android phone to do supervised learning. I just found an excellent thesis on a Bag-of-Features approach to time series feature generation/extraction that might help him out a lot: 2012 Baydogan, ASU. And that got me thinking about NLP using word2vec sequences as an ordered set instead of a bag of words/features. It's high time someone brought the subtleties of word order (in addition to meaning) into NLP. I bet conventional time series ML (especially feature extraction) would work great on sequences of 10-D or 100-D word vectors.
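To make that idea a little more concrete, here's a minimal sketch of treating a sentence as a short multivariate time series of word vectors and pooling simple windowed statistics, so that local word order contributes to the features. The windowing and pooling choices below are my own illustration, not Baydogan's actual TSBF pipeline, and it assumes you already have pre-computed word vectors (random stand-ins are used here):

```python
import numpy as np

def order_aware_features(word_vecs, window=3):
    """Rough sketch: pool per-window statistics over a sequence of word
    vectors so local word order shows up in the features, unlike a plain
    bag-of-words average. Not Baydogan's full TSBF method.

    word_vecs: (n_words, n_dims) array, e.g. 10-D or 100-D word2vec vectors.
    Returns a fixed-length 1-D feature vector (3 * n_dims values).
    """
    n_words, n_dims = word_vecs.shape
    window = min(window, n_words)
    win_means, win_trends = [], []
    for start in range(n_words - window + 1):
        win = word_vecs[start:start + window]
        win_means.append(win.mean(axis=0))       # local "meaning"
        win_trends.append(win[-1] - win[0])      # local shift = word-order signal
    return np.concatenate([
        word_vecs.mean(axis=0),                  # global bag-of-words baseline
        np.mean(win_means, axis=0),              # average local meaning
        np.mean(np.abs(win_trends), axis=0),     # how much meaning moves locally
    ])

# Hypothetical usage with random stand-ins for real word2vec vectors:
np.random.seed(0)
sentence = np.random.randn(7, 10)                # 7 words, 10-D embeddings
print(order_aware_features(sentence).shape)      # (30,)
```

A real pipeline would swap the random vectors for embeddings from a trained word2vec model and feed these features to whatever classifier Thunder is already using.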

Continue reading

Congratulations Thunder

We're happy to announce the winner of this year's Total Good Intelligent Machines grant! Thunder was awarded $1000 this week to support a summer of data science work for Hack Oregon. He plans to mine campaign finance data and contribute to Hack Oregon's open-source code base. He'll also contribute to Total Good's open-source [machine intelligence utilities](http://github.com/totalgood/pug-ann) and hyperparameter-optimization technology. He's already begun by getting the pug-ann documentation in order and getting up to speed on the various Hack Oregon projects.

Continue reading