Charu C. Aggarwal: Neural Networks and Deep Learning
Neural Networks and Deep Learning - A Textbook
Book
Available for delivery within 2-3 weeks
(subject to availability from the supplier)
EUR 53,32*
- Springer International Publishing, 07/2024
- Binding: Softcover / Paperback
- Language: English
- ISBN-13: 9783031296444
- Order number: 11909102
- Extent: 556 pages
- Edition number: 24002
- Edition: Second Edition 2023
- Weight: 1032 g
- Dimensions: 254 x 178 mm
- Thickness: 30 mm
- Publication date: July 1, 2024
Please note: this item is not in German!
Blurb
This book covers both classical and modern models in deep learning. The primary focus is on the theory and algorithms of deep learning, which are essential for understanding the key design principles behind neural architectures in different applications. Why do neural networks work? When do they work better than off-the-shelf machine-learning models? When is depth useful? Why is training neural networks so hard? What are the pitfalls? The book also discusses a wide range of applications in order to give the practitioner a flavor of how neural architectures are designed for different types of problems. Deep learning methods for various data domains, such as text, images, and graphs, are presented in detail. The chapters of this book span three categories:
- The basics of neural networks: The backpropagation algorithm is discussed in Chapter 2. Many traditional machine learning models can be understood as special cases of neural networks, and Chapter 3 explores these connections: support vector machines, linear / logistic regression, singular value decomposition, matrix factorization, and recommender systems are all shown to be special cases of neural networks (a short code sketch of this idea follows the list).
- Fundamentals of neural networks: A detailed discussion of training and regularization is provided in Chapters 4 and 5. Chapters 6 and 7 present radial-basis function (RBF) networks and restricted Boltzmann machines.
- Advanced topics in neural networks: Chapters 8, 9, and 10 discuss recurrent neural networks, convolutional neural networks, and graph neural networks. Several advanced topics like deep reinforcement learning, attention mechanisms, transformer networks, Kohonen self-organizing maps, and generative adversarial networks are introduced in Chapters 11 and 12.
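As a concrete illustration of the "special cases" point above, here is a minimal sketch (not taken from the book's own material) of logistic regression written as a single-neuron neural network with a sigmoid activation; the gradient-descent update is exactly what backpropagation computes in this one-layer case. The toy data, learning rate, and iteration count are illustrative assumptions.

```python
# A minimal sketch: logistic regression as a one-neuron neural network.
# The gradient step below is what backpropagation reduces to when the
# network has a single sigmoid unit and a cross-entropy loss.
import numpy as np

rng = np.random.default_rng(0)

# Toy data: two Gaussian blobs (hypothetical example data).
X = np.vstack([rng.normal(-1.0, 1.0, (50, 2)), rng.normal(1.0, 1.0, (50, 2))])
y = np.concatenate([np.zeros(50), np.ones(50)])

w, b = np.zeros(2), 0.0           # weights and bias of the single neuron
lr = 0.1                          # learning rate (illustrative choice)

for _ in range(200):
    z = X @ w + b                 # pre-activation
    p = 1.0 / (1.0 + np.exp(-z))  # sigmoid output = predicted probability
    grad_z = (p - y) / len(y)     # gradient of cross-entropy loss w.r.t. z
    w -= lr * (X.T @ grad_z)      # backpropagated gradient for the weights
    b -= lr * grad_z.sum()        # ... and for the bias

print(f"training accuracy: {((p > 0.5) == y).mean():.2f}")
```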
The textbook is written for graduate students and upper-level undergraduate students. Researchers and practitioners working in this field will want to purchase it as well.
Where possible, an application-centric view is highlighted in order to provide an understanding of the practical uses of each class of techniques. The second edition is substantially reorganized and expanded with separate chapters on backpropagation and graph neural networks. Many chapters have been significantly revised over the first edition.
Greater focus is placed on modern deep learning ideas such as attention mechanisms, transformers, and pre-trained language models.
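For readers curious about the attention mechanisms named above, the following is a minimal sketch of scaled dot-product attention, the core operation inside transformer networks. It is illustrative only; the function name, toy shapes, and random inputs are assumptions rather than material from the book.

```python
# A minimal sketch of scaled dot-product attention (illustrative only).
# Each query attends to all keys and returns a softmax-weighted
# combination of the corresponding values.
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Q, K, V: arrays of shape (sequence_length, d_model)."""
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)                   # similarity of each query to each key
    scores -= scores.max(axis=-1, keepdims=True)    # numerical stability for the softmax
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)  # attention weights sum to 1 per query
    return weights @ V                              # weighted sum of the values

# Usage with random toy inputs (4 tokens, 8-dimensional embeddings).
rng = np.random.default_rng(0)
Q = rng.normal(size=(4, 8))
K = rng.normal(size=(4, 8))
V = rng.normal(size=(4, 8))
print(scaled_dot_product_attention(Q, K, V).shape)  # -> (4, 8)
```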