Hacker News
More flexible machine learning improves image classification (news.mit.edu)
22 points by gruez on Oct 3, 2015 | 6 comments


The paper described in the article ("Learning with a Wasserstein Loss", Frogner et al.): http://arxiv.org/pdf/1506.05439v1.pdf


So is this just the k-nearest neighbour approach with the Wasserstein distance as the measure? Is it the efficient implementation of the Wasserstein distance that makes this outcome interesting? Or is it the use of a non-parametric model in image classification?
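
(For context, the efficiency trick appears to be entropic regularization, which turns the transport problem into simple matrix scaling, i.e. Sinkhorn iterations. A toy numpy sketch of that approximation, with made-up values, not the paper's code:

    import numpy as np

    def sinkhorn_distance(p, q, cost, reg=0.1, n_iter=100):
        """Approximate Wasserstein distance between histograms p and q.

        p, q : 1-D probability vectors (each sums to 1)
        cost : pairwise ground-metric matrix, cost[i, j] >= 0
        reg  : entropic regularization (smaller = closer to exact OT)
        """
        K = np.exp(-cost / reg)            # Gibbs kernel
        u = np.ones_like(p)
        for _ in range(n_iter):            # alternating Sinkhorn scalings
            v = q / (K.T @ u)
            u = p / (K @ v)
        transport = np.diag(u) @ K @ np.diag(v)   # approximate optimal plan
        return np.sum(transport * cost)

    # Toy example: two distributions over three labels.
    p = np.array([0.8, 0.1, 0.1])
    q = np.array([0.1, 0.8, 0.1])
    cost = np.array([[0.0, 0.5, 2.0],
                     [0.5, 0.0, 2.0],
                     [2.0, 2.0, 0.0]])
    print(sinkhorn_distance(p, q, cost))

Each iteration is just two matrix-vector products, which is what makes the distance cheap enough to sit inside a training loop.)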


Previous models had a loss function where a prediction is either right or wrong. This model's loss takes the semantic similarity between labels into account, so a near-miss costs less than a wild guess.
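
A toy illustration of the difference (made-up numbers, assuming a ground metric where a husky is closer to a wolf than to a car):

    import numpy as np

    # One-hot target: the true label is "husky" (index 0 of [husky, wolf, car]).
    target = np.array([1.0, 0.0, 0.0])

    # Two wrong predictions: one puts its mass on "wolf", one on "car".
    pred_wolf = np.array([0.1, 0.8, 0.1])
    pred_car  = np.array([0.1, 0.1, 0.8])

    def cross_entropy(pred, target):
        return -np.sum(target * np.log(pred))

    # Cross-entropy only sees the probability assigned to the true label,
    # so both mistakes cost exactly the same:
    print(cross_entropy(pred_wolf, target))   # 2.30...
    print(cross_entropy(pred_car, target))    # 2.30...

    # With a one-hot target, the optimal-transport cost reduces to a
    # weighted average of ground distances to the true label, so the
    # semantically closer mistake is penalized less:
    dist_to_husky = np.array([0.0, 0.5, 2.0])  # d(husky, {husky, wolf, car})
    print(pred_wolf @ dist_to_husky)           # 0.6
    print(pred_car  @ dist_to_husky)           # 1.65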


Could this approach benefit from using Word2Vec?


The way I read the article (haven't read the paper), the methods are very similar in concept, in that both use word embeddings.

Basically, instead of training on Wikipedia, they train on the tags assigned by Flickr users. They also use a different word embedding algorithm than word2vec, but the principle is similar imho.
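
Roughly this shape, with gensim's word2vec standing in for whatever embedding algorithm the paper actually uses (the tag lists and gensim 4.x parameter names are my assumptions):

    import numpy as np
    from gensim.models import Word2Vec

    tag_lists = [                       # each photo's user-assigned tags
        ["husky", "dog", "snow"],
        ["wolf", "snow", "forest"],
        ["car", "street", "city"],
        ["dog", "park", "city"],
    ]
    model = Word2Vec(tag_lists, vector_size=32, min_count=1, epochs=50)

    labels = ["husky", "wolf", "car"]
    vecs = np.stack([model.wv[t] for t in labels])

    # Ground metric for the Wasserstein loss: pairwise embedding distance.
    cost = np.linalg.norm(vecs[:, None, :] - vecs[None, :, :], axis=-1)
    print(cost)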


Maybe. Phrase tags ("Siberian husky") could be a problem, and you'd use a different distance measure.
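
One common workaround (my suggestion, nothing from the paper) is to represent a multi-word tag by the mean of its word vectors:

    import numpy as np

    # `word_vecs` stands in for any word -> vector lookup (e.g. a trained
    # word2vec model's model.wv); the values here are made up.
    word_vecs = {
        "siberian": np.array([0.9, 0.1]),
        "husky":    np.array([0.8, 0.3]),
    }

    def tag_vector(tag):
        return np.mean([word_vecs[w] for w in tag.lower().split()], axis=0)

    print(tag_vector("Siberian husky"))   # [0.85 0.2 ]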



