The other day, Daniel Lemire posted a comment extolling Peter Turney as someone who does a great job blogging about his research. His blog, Apperceptual, is one of the highest-quality blogs I’ve seen in the information retrieval community.
Turney is a Research Officer at Canada’s National Research Council (NRC) Institute for Information Technology. His two decades of research cover a broad spectrum of topics in machine learning, information retrieval, and computational linguistics. Moreover, the practical orientation of the NRC helps ensure that Peter’s scholarly work is grounded in the real world.
The best way to get a feeling for Turney’s blog is to read it. Here are a few posts I’d suggest:
- A Uniform Approach to Analogies, Synonyms, Antonyms, and Associations
- SVD, Variance, and Sparsity
- Beyond Proportional Analogy
This last post, published today, offers a promising approach towards establishing analogies as the central problem in a theory of semantics. Or, as Douglas Hofstadter puts it in a passage Turney quotes, “all meaning comes from analogies”.
Turney’s writing isn’t always so heavy. In fact, two of his most popular posts are “Open Problems” and “How to Maximize Citations”, both of which I’d recommend to aspiring researchers.
Turney doesn’t crank out blog posts daily or even weekly; he sometimes goes for over a month between posts. But what he does write is well worth reading.