We do Deep Learning.
Dec 12, 2016 • Lyndon Maydwell
Here’s a collection of some recent machine-learning, artificial-intelligence and software-engineering papers, posts, and press-releases that have caught our attention. As always, if some of these posts weren’t written recently, then they were at least recently discovered by us!
Note: Updates to this post are still happening as December isn’t over yet :)
If you’re looking to hear about some recent papers on quantum machine learning, then you’ve come to the right place! Noon covers three recent papers in this post that combine two very active, cutting-edge areas of research:
- Quantum generalisation of feedforward neural networks by Wan, Dahlsten, Kristjánsson, Gardner and Kim.
- Quantum gradient descent and Newton’s method for constrained polynomial optimization by Rebentrost, Schuld, Petruccione and Lloyd.
- Quantum autoencoders for efficient compression of quantum data by Romero, Olson and Aspuru-Guzik.
Although this may sound like a double-niche area of research, both areas are very active right now, and the benefits of combining the two could be huge.
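To give a feel for what the second paper is quantising, here is a minimal classical gradient-descent sketch in Haskell (our own illustration, not code from any of the papers): minimising a simple polynomial by repeatedly stepping against its gradient. The learning rate, the example polynomial, and the function names are all ours.

```haskell
type R = Double

-- Classical gradient descent: from a starting point, repeatedly step a
-- distance eta against the gradient. Returns the (infinite) list of iterates.
gradientDescent :: R -> (R -> R) -> R -> [R]
gradientDescent eta grad = iterate (\x -> x - eta * grad x)

-- Example: minimise f(x) = (x - 3)^2, whose gradient is 2(x - 3).
main :: IO ()
main = print (gradientDescent 0.1 (\x -> 2 * (x - 3)) 0 !! 100)
```

After 100 steps the iterate is essentially at the minimum x = 3; the quantum version asks how this kind of iteration could be run with quantum state preparation and measurement in the loop.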
Next up, a paper on detecting clickbait with neural networks. The abstract says it best:
Online content publishers often use catchy headlines for their articles in order to attract users to their websites… Here, we introduce a neural network architecture based on Recurrent Neural Networks for detecting clickbaits. Our model relies on distributed word representations learned from large unannotated corpora, and character embeddings learned via Convolutional Neural Networks…
Results are very good!
In the vein of, and building on many of the ideas in, Structure and Interpretation of Classical Mechanics, “Learn Physics by Programming in Haskell” builds up many Newtonian-physics principles and primitives in the Haskell programming language. It is aimed not at physicists or at Haskell programmers, but at beginners in both fields, using each subject to deepen understanding of the other. So… if you have Newtonian mechanics and Haskell on your TO-LEARN list, then you’ve certainly stumbled across a local maximum here.
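As a taste of the style (this is our own toy sketch, not code from the paper, whose actual types and function names differ): one can represent a particle’s state as a Haskell record and step it forward with Euler integration under Newton’s second law.

```haskell
type R = Double

-- Position and velocity of a particle moving in one dimension.
data State = State { pos :: R, vel :: R } deriving Show

-- One Euler step of size dt for a particle of mass m under a force law:
--   x' = x + v * dt,   v' = v + (F / m) * dt
eulerStep :: R -> R -> (State -> R) -> State -> State
eulerStep m dt force s@(State x v) =
  State (x + v * dt) (v + (force s / m) * dt)

-- Constant gravity near Earth's surface (force on a unit mass).
gravity :: State -> R
gravity _ = -9.81

-- Simulate by iterating the step, producing an infinite trajectory.
simulate :: R -> R -> (State -> R) -> State -> [State]
simulate m dt force = iterate (eulerStep m dt force)

main :: IO ()
main = mapM_ print (take 5 (simulate 1 0.1 gravity (State 0 10)))
```

The appeal the paper leans on is visible even here: the types document the physics, and swapping in a different force law is just passing a different function.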