So the 25-page chapter breaks down into an introduction and six independent sections. Let's sample the delicacies.
Introduction: The authors claim NNs "shook up the statistics community" in the 1980s, that the NN literature is very "colorful", and that the statistician's response was a knee-jerk "What's the big deal?" dismissal — but NNs started "solving problems on a scale far exceeding what the statistics community was used to". This led to new journals and "several popular conferences at ski resorts". Then NNs died a brief death in the mid-1990s but "reemerged with a vengeance after 2010", and that resurgence is what we now call Deep Learning.
Generally, you don't see textbooks taking swipes at whole domains, with flippant commentary thrown in as a bonus. So the intro was fun to read. I think more books should be written in this sort of breezy fashion; it gets the reader firmly hooked.
As far as technical material goes, the intro presents the usual feedforward network in a single line, along with the associated equations.
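For readers who want the one-liner spelled out: the standard single-hidden-layer feedforward model is f(X) = β₀ + Σₖ βₖ g(wₖ₀ + Σⱼ wₖⱼ Xⱼ), with g a nonlinear activation. Here is a minimal NumPy sketch of that model — my own illustration of the generic equation, not the book's exact notation or code; the ReLU choice of g and all variable names are assumptions.

```python
import numpy as np

def feedforward(X, W1, b1, W2, b2):
    """Single-hidden-layer feedforward network (a sketch, not the book's code).

    Computes f(X) = b2 + A(X) @ W2, where the hidden activations are
    A(X) = g(b1 + X @ W1) and g is the ReLU nonlinearity (an assumed choice).
    """
    hidden = np.maximum(0.0, X @ W1 + b1)  # g(w_k0 + sum_j w_kj * X_j) per unit k
    return hidden @ W2 + b2                # beta_0 + sum_k beta_k * h_k(X)

# Tiny example: 4 observations, 2 inputs, 3 hidden units, scalar output.
rng = np.random.default_rng(0)
X = rng.normal(size=(4, 2))
W1 = rng.normal(size=(2, 3)); b1 = np.zeros(3)
W2 = rng.normal(size=(3, 1)); b2 = np.zeros(1)
print(feedforward(X, W1, b1, W2, b2).shape)  # (4, 1): one prediction per row
```

The "it's just a nonlinear model" point is visible right there: with g the identity, the whole thing collapses to linear regression, so the nonlinearity of g is doing all the interesting work.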
The takeaway from the intro is that NNs are "just a nonlinear model", but their influence comes from the architecture ("they can be scaled up and generalized in a variety of ways...many units in a layer...many layers...weight sharing...colorful forms of regularization" — there's that word again, colorful!). They have found their "ideal niche", which turns out to be image classification and NLP, and their success is due to "massive improvements in computer resources". I imagine the ski resort conferences don't hurt.