Readings – Deep Learning – (Slightly) More Advanced
What to Read and Study Once You Know a Little
Convolutional Neural Networks (CNNs): Image Processing
- Shrivastava, P. (Aug. 12, 2017), Must-read Path-breaking Papers About Image Classification. AJM’s Note: This one is really good; it covers each of the major image-processing / CNN architectures as they’ve evolved over the past several years, and gives links to the core papers.
YouTubes on CNNs:
- Geoffrey Hinton, on “What’s Wrong with CNNs”. AJM’s Note: On my to-watch list, for when I have an hour … lots to do with capsules, Hough transform, etc.
(Figure: some of the convolutional neural networks described in the preceding paper.)
The Boltzmann Machine
- Maren, A.J. (1989). A Tutorial on the Boltzmann Distribution and Energy Minimization for Neural Network Ensembles, White Paper, University of Tennessee Space Institute. DOI: 10.13140/2.1.3044.7685 (pdf). AJM’s Note: Much of the deep learning work builds on the Boltzmann machine. The Boltzmann machine uses statistical thermodynamics, and minimizes a free-energy function. (That’s how it learns each different pattern.) If you do not already know and understand statistical thermodynamics, this White Paper (Tutorial) gives you enough background to understand the Boltzmann machine algorithm.
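The note above can be made concrete with a tiny sketch. For a Boltzmann machine, each binary state s has an energy E(s) = -½ Σᵢⱼ wᵢⱼ sᵢ sⱼ - Σᵢ bᵢ sᵢ, and the network assigns each state the Boltzmann probability p(s) = exp(-E(s)/T) / Z. The weights and biases below are illustrative toy values (not from the tutorial); the point is only that lower-energy states receive higher probability, which is what learning exploits.

```python
import itertools
import math

# Illustrative toy parameters (assumed, not from the tutorial):
# symmetric weights with zero diagonal, plus per-unit biases.
W = [[0.0, 1.0, -0.5],
     [1.0, 0.0, 0.5],
     [-0.5, 0.5, 0.0]]
b = [0.1, -0.2, -0.3]
T = 1.0  # temperature

def energy(s):
    """Boltzmann machine energy: E(s) = -1/2 * sum_ij w_ij s_i s_j - sum_i b_i s_i."""
    n = len(s)
    quad = sum(W[i][j] * s[i] * s[j] for i in range(n) for j in range(n))
    lin = sum(b[i] * s[i] for i in range(n))
    return -0.5 * quad - lin

# Boltzmann distribution over all 2^3 binary states: p(s) = exp(-E(s)/T) / Z,
# where Z (the partition function) normalizes the probabilities.
states = list(itertools.product([0, 1], repeat=3))
Z = sum(math.exp(-energy(s) / T) for s in states)
probs = {s: math.exp(-energy(s) / T) / Z for s in states}

# Lower-energy states get higher probability; training adjusts W and b so the
# low-energy (high-probability) states correspond to the stored patterns.
for s in states:
    print(s, round(energy(s), 3), round(probs[s], 4))
```

Running this shows that the minimum-energy state is also the most probable one, which is the sense in which "minimizing free energy" and "learning a pattern" line up.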
Important, Even If It’s Not Deep Learning
AJM’s Note: Deep learning, and the build-up of neural networks (piled higher and deeper), is the current craze. Let’s pull focus. Deep learning methods are largely bottom-up, even when they combine supervised and unsupervised layers. They work from smaller to larger data elements (representations). At some point, we need to get out of the granular and move towards the symbolic: ontologies, knowledge graphs (as per Google), etc.
To do this, we need a bridging representation. Development of these bridging representations, using perceptual processes, was a key to getting computer vision to work. (See David Marr’s 2½-D sketch work.)
John Sowa and Charles Peirce are two magnificent thinkers who have contributed in a major way to this kind of “bridging representation.” Even if you don’t read about them in depth, it is REALLY GOOD to know about them in general: what they’ve done, and how their work fits in. This is very important for broader-scope AI.
- Mike Bergman, Why I study C. S. Peirce.