
Wednesday, October 30, 2002

The Elements of Statistical Learning



I keep an eye on the AI-related talks at the universities and institutes around here, and once in a while one catches my eye enough to get me out of my cave to see it. Most recently, it was this one (at UCSD):

BRUNO OLSHAUSEN

Center for Neuroscience
University of California, Davis

"Sparse Coding of Time-Varying Natural Images"

Abstract:

The images that fall upon our retinae contain certain statistical regularities over space and time. In this talk I will discuss a method for modeling this structure based upon sparse coding in time. When adapted to time-varying natural images, the model develops a set of space-time receptive fields similar to those of simple cells in the primary visual cortex. A continuous image sequence is thus re-represented in terms of a set of punctate, spike-like events in time. The suggestion is that *both* the receptive fields of V1 neurons and the spiking nature of neural activity go hand in hand; that is, they are part of a coordinated strategy for producing sparse representations of sensory data.
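
Since "sparse coding" gets thrown around a lot, here's a minimal sketch of the classic static-image formulation (in the spirit of Olshausen & Field's earlier work) that this talk extends to time: infer sparse coefficients for each image under a reconstruction-plus-sparsity cost, then nudge the basis functions along the residual. All the names, step sizes, and toy random data below are my own illustrative assumptions, not anything from the talk.

    import numpy as np

    # Minimal sketch of Olshausen & Field-style sparse coding on static
    # image patches (the talk's model adds the time dimension). Names and
    # parameters are illustrative assumptions, not the speaker's code.

    rng = np.random.default_rng(0)

    n_pixels = 64    # 8x8 image patches, flattened
    n_basis  = 100   # overcomplete dictionary
    lam      = 0.1   # sparsity penalty weight
    lr_a     = 0.01  # inference step size
    lr_phi   = 0.1   # basis learning rate

    Phi = rng.standard_normal((n_pixels, n_basis))
    Phi /= np.linalg.norm(Phi, axis=0)   # unit-norm basis functions

    def infer(I, Phi, steps=200):
        """Sparse coefficients a minimizing ||I - Phi a||^2 + lam*|a|_1,
        found here by plain subgradient descent."""
        a = np.zeros(Phi.shape[1])
        for _ in range(steps):
            grad = Phi.T @ (Phi @ a - I) + lam * np.sign(a)
            a -= lr_a * grad
        return a

    def train(patches, Phi, epochs=10):
        """Alternate sparse inference with a gradient step on the basis."""
        for _ in range(epochs):
            for I in patches:
                a = infer(I, Phi)
                residual = I - Phi @ a
                Phi = Phi + lr_phi * np.outer(residual, a)  # Hebbian-like
                Phi /= np.linalg.norm(Phi, axis=0)          # renormalize
        return Phi

    # Toy data: random "patches" stand in for whitened natural images.
    patches = rng.standard_normal((50, n_pixels))
    Phi = train(patches, Phi)

Real experiments would use whitened natural-image patches rather than noise; the point is just the alternation between sparse inference and the Hebbian-like basis update, which is what yields the simple-cell-like receptive fields.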

My intuition was vindicated when I tried to arrange lunch with Rageesh around the talk and learned that his (related) class had been canceled so everyone could attend it instead. Made the logistics easy. And the talk was indeed quite good: perhaps the first intuitively palpable theory I've seen addressing neuronal spike timing, among other things.

And the lunch was quite good too: chicken in a pomegranate and walnut sauce at the Persian place near campus. Yum. It's a regular special there, and I went hoping to find it.

People ask me periodically if I know any good AI books for someone new to the field who wants to actually learn the stuff and not just talk about it at futurist parties. Unfortunately, most of the literature either falls into the coffee-table (popular science) category or consists of an endless stream of cryptic papers out of universities and research institutes, all of which take special pride in opaque mathematics, terminology, and references to prior papers that only a grad student would have easy access to.

But Rageesh recently found a great book, which I ended up buying a copy of from the campus bookstore on the way back, called The Elements of Statistical Learning. I've only just started reading it, but from skimming the topics and spot-checking the style and presentation, it looks like a great college-level treatment of some of the key concepts of modern AI (just some, but I can see this being one of a few books to have). With the caveat that I haven't actually read it cover to cover yet, I tentatively recommend it to anyone seriously interested in entering the field. It does look a wee bit mathy in parts, but not needlessly so, and I have a feeling that if read in order it should be quite accessible to anyone who could pass a mid-level undergraduate physics course.

Title: The Elements of Statistical Learning: Data Mining, Inference, and Prediction
Author: Trevor Hastie
Joint Authors: Robert Tibshirani, Jerome Friedman
Format: Hardcover
ISBN: 0387952845
Pages: 520
Publish Date: 9/1/2001
Publisher: Springer-Verlag TELOS




Simon Funk / simonfunk@gmail.com