Thunder has been modeling time series from his Android phone to do supervised learning. I just found an excellent thesis on a Bag-of-Features approach to time series feature generation/extraction that might help him out a lot: Baydogan, ASU, 2012. And that got me thinking about NLP using word2vec sequences as an ordered set instead of a bag of words/features. It's high time someone brought the subtleties of word order (in addition to meaning) into NLP. I bet conventional time series ML (especially feature extraction) would work great on sequences of 10-D or 100-D word vectors.
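A minimal sketch of the idea: treat a sentence as a (T, D) multivariate time series of word vectors and pull out simple order-aware features per dimension. The embeddings here are random toy stand-ins (in practice they'd come from a trained word2vec model), and the feature choices are just illustrative, not anything from the Baydogan thesis.

```python
import numpy as np

# Toy stand-in for word2vec: hypothetical 10-D embeddings keyed by word.
# A real pipeline would load these from a trained model instead.
rng = np.random.default_rng(0)
vocab = ["the", "cat", "sat", "on", "mat"]
embed = {w: rng.normal(size=10) for w in vocab}

def sentence_features(words):
    """Treat a sentence as a (T, D) series of word vectors and extract
    simple per-dimension features, some of which depend on word order."""
    X = np.stack([embed[w] for w in words])  # shape (T, D)
    dX = np.diff(X, axis=0)                  # first differences: sensitive to order
    return np.concatenate([
        X.mean(axis=0),   # bag-of-words-like: order-invariant average meaning
        X.std(axis=0),    # spread of meanings within the sentence
        dX.mean(axis=0),  # average "direction of travel" through meaning space
    ])

feats = sentence_features(["the", "cat", "sat", "on", "the", "mat"])
print(feats.shape)  # (30,) = 3 feature types x 10 dims
```

Reversing the sentence flips the sign of the difference features while leaving the mean and std untouched, so the resulting vectors separate word order from aggregate meaning.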
