Detail View

Deep learning with python (Loaned 15 times)

Material type
Monograph
Personal Author
Chollet, François.
Title Statement
Deep learning with python / François Chollet.
Publication, Distribution, etc.
Shelter Island, NY : Manning, c2018.
Physical Medium
xxi, 361 p. : ill. ; 24 cm.
ISBN
9781617294433
General Note
Includes index.
Subject Added Entry-Topical Term
Python (Computer program language). Machine learning. Neural networks (Computer science).
000 00000nam u2200205 a 4500
001 000045935549
005 20180315110141
008 180314s2018 nyua 001 0 eng d
020 ▼a 9781617294433
035 ▼a (KERIS)BIB000014702280
040 ▼a 211048 ▼c 211048 ▼d 211009
082 0 4 ▼a 006.31 ▼2 23
084 ▼a 006.31 ▼2 DDCK
090 ▼a 006.31 ▼b C547d
100 1 ▼a Chollet, François.
245 1 0 ▼a Deep learning with python / ▼c François Chollet.
260 ▼a Shelter Island, NY : ▼b Manning, ▼c c2018.
300 ▼a xxi, 361 p. : ▼b ill. ; ▼c 24 cm.
500 ▼a Includes index.
650 0 ▼a Python (Computer program language).
650 0 ▼a Machine learning.
650 0 ▼a Neural networks (Computer science).
945 ▼a KLPA

Holdings Information

No. | Location                      | Call Number  | Accession No. | Availability | Due Date
1   | Main Library / Western Books  | 006.31 C547d | 111787697     | Available    | -
2   | Main Library / Western Books  | 006.31 C547d | 111810246     | Available    | -

Contents Information

Author Introduction

François Chollet (Author)

He works on deep learning at Google in Mountain View, California. He is the creator of the Keras deep-learning library and a contributor to the TensorFlow machine-learning framework. His deep-learning research focuses on machine-learning applications for computer vision and formal reasoning. His papers have been presented at major conferences and workshops including CVPR (Computer Vision and Pattern Recognition), NIPS (Neural Information Processing Systems), and ICLR (International Conference on Learning Representations).

Information Provided By: Aladin

Table of Contents

Preface	p. xiii
Acknowledgments	p. xv
About this book	p. xvi
About the author	p. xx
About the cover	p. xxi
Part 1	Fundamentals of deep learning	p. 1
1	    What is deep learning?	p. 3
1.1	        Artificial intelligence, machine learning, and deep learning	p. 4
            Artificial intelligence	p. 4
            Machine learning	p. 4
            Learning representations from data	p. 6
            The "deep" in deep learning	p. 8
            Understanding how deep learning works, in three figures	p. 9
            What deep learning has achieved so far	p. 11
            Don't believe the short-term hype	p. 12
            The promise of AI	p. 13
1.2	        Before deep learning: a brief history of machine learning	p. 14
            Probabilistic modeling	p. 14
            Early neural networks	p. 14
            Kernel methods	p. 15
            Decision trees, random forests, and gradient boosting machines	p. 16
            Back to neural networks	p. 17
            What makes deep learning different	p. 17
            The modern machine-learning landscape	p. 18
1.3	        Why deep learning? Why now?	p. 20
            Hardware	p. 20
            Data	p. 21
            Algorithms	p. 21
            A new wave of investment	p. 22
            The democratization of deep learning	p. 23
            Will it last?	p. 23
2	    Before we begin: the mathematical building blocks of neural networks	p. 25
2.1	        A first look at a neural network	p. 27
2.2	        Data representations for neural networks	p. 31
            Scalars (0D tensors)	p. 31
            Vectors (1D tensors)	p. 31
            Matrices (2D tensors)	p. 31
            3D tensors and higher-dimensional tensors	p. 32
            Key attributes	p. 32
            Manipulating tensors in Numpy	p. 34
            The notion of data batches	p. 34
            Real-world examples of data tensors	p. 35
            Vector data	p. 35
            Timeseries data or sequence data	p. 35
            Image data	p. 36
            Video data	p. 37
2.3	        The gears of neural networks: tensor operations	p. 38
            Element-wise operations	p. 38
            Broadcasting	p. 39
            Tensor dot	p. 40
            Tensor reshaping	p. 42
            Geometric interpretation of tensor operations	p. 43
            A geometric interpretation of deep learning	p. 44
2.4	        The engine of neural networks: gradient-based optimization	p. 46
            What's a derivative?	p. 47
            Derivative of a tensor operation: the gradient	p. 48
            Stochastic gradient descent	p. 48
            Chaining derivatives: the Backpropagation algorithm	p. 51
2.5	        Looking back at our first example	p. 53
2.6	        Chapter summary	p. 55
3	    Getting started with neural networks	p. 56
3.1	        Anatomy of a neural network	p. 58
            Layers: the building blocks of deep learning	p. 58
            Models: networks of layers	p. 59
            Loss functions and optimizers: keys to configuring the learning process	p. 60
3.2	        Introduction to Keras	p. 61
            Keras, TensorFlow, Theano, and CNTK	p. 62
            Developing with Keras: a quick overview	p. 62
3.3	        Setting up a deep-learning workstation	p. 65
            Jupyter notebooks: the preferred way to run deep-learning experiments	p. 65
            Getting Keras running: two options	p. 66
            Running deep-learning jobs in the cloud: pros and cons	p. 66
            What is the best GPU for deep learning?	p. 66
3.4	        Classifying movie reviews: a binary classification example	p. 68
            The IMDB dataset	p. 68
            Preparing the data	p. 69
            Building your network	p. 70
            Validating your approach	p. 73
            Using a trained network to generate predictions on new data	p. 76
            Further experiments	p. 77
            Wrapping up	p. 77
3.5	        Classifying newswires: a multiclass classification example	p. 78
            The Reuters dataset	p. 78
            Preparing the data	p. 79
            Building your network	p. 79
            Validating your approach	p. 80
            Generating predictions on new data	p. 83
            A different way to handle the labels and the loss	p. 83
            The importance of having sufficiently large intermediate layers	p. 83
            Further experiments	p. 84
            Wrapping up	p. 84
3.6	        Predicting house prices: a regression example	p. 85
            The Boston Housing Price dataset	p. 85
            Preparing the data	p. 86
            Building your network	p. 86
            Validating your approach using K-fold validation	p. 87
            Wrapping up	p. 91
3.7	        Chapter summary	p. 92
4	    Fundamentals of machine learning	p. 93
4.1	        Four branches of machine learning	p. 94
            Supervised learning	p. 94
            Unsupervised learning	p. 94
            Self-supervised learning	p. 94
            Reinforcement learning	p. 95
4.2	        Evaluating machine-learning models	p. 97
            Training, validation, and test sets	p. 97
            Things to keep in mind	p. 100
4.3	        Data preprocessing, feature engineering, and feature learning	p. 101
            Data preprocessing for neural networks	p. 101
            Feature engineering	p. 102
4.4	        Overfitting and underfitting	p. 104
            Reducing the network's size	p. 104
            Adding weight regularization	p. 107
            Adding dropout	p. 109
4.5	        The universal workflow of machine learning	p. 111
            Defining the problem and assembling a dataset	p. 111
            Choosing a measure of success	p. 112
            Deciding on an evaluation protocol	p. 112
            Preparing your data	p. 112
            Developing a model that does better than a baseline	p. 113
            Scaling up: developing a model that overfits	p. 114
            Regularizing your model and tuning your hyperparameters	p. 114
4.6	        Chapter summary	p. 116
Part 2	Deep learning in practice	p. 117
5	    Deep learning for computer vision	p. 119
5.1	        Introduction to convnets	p. 120
            The convolution operation	p. 122
            The max-pooling operation	p. 127
5.2	        Training a convnet from scratch on a small dataset	p. 130
            The relevance of deep learning for small-data problems	p. 130
            Downloading the data	p. 131
            Building your network	p. 133
            Data preprocessing	p. 135
            Using data augmentation	p. 138
5.3	        Using a pretrained convnet	p. 143
            Feature extraction	p. 143
            Fine-tuning	p. 152
            Wrapping up	p. 159
5.4	        Visualizing what convnets learn	p. 160
            Visualizing intermediate activations	p. 160
            Visualizing convnet filters	p. 167
            Visualizing heatmaps of class activation	p. 172
5.5	        Chapter summary	p. 177
6	    Deep learning for text and sequences	p. 178
6.1	        Working with text data	p. 180
            One-hot encoding of words and characters	p. 181
            Using word embeddings	p. 184
            Putting it all together: from raw text to word embeddings	p. 188
            Wrapping up	p. 195
6.2	        Understanding recurrent neural networks	p. 196
            A recurrent layer in Keras	p. 198
            Understanding the LSTM and GRU layers	p. 202
            A concrete LSTM example in Keras	p. 204
            Wrapping up	p. 206
6.3	        Advanced use of recurrent neural networks	p. 207
            A temperature-forecasting problem	p. 207
            Preparing the data	p. 210
            A common-sense, non-machine-learning baseline	p. 212
            A basic machine-learning approach	p. 213
            A first recurrent baseline	p. 215
            Using recurrent dropout to fight overfitting	p. 216
            Stacking recurrent layers	p. 217
            Using bidirectional RNNs	p. 219
            Going even further	p. 222
            Wrapping up	p. 223
6.4	        Sequence processing with convnets	p. 225
            Understanding 1D convolution for sequence data	p. 225
            1D pooling for sequence data	p. 226
            Implementing a 1D convnet	p. 226
            Combining CNNs and RNNs to process long sequences	p. 228
            Wrapping up	p. 231
6.5	        Chapter summary	p. 232
7	    Advanced deep-learning best practices	p. 233
7.1	        Going beyond the Sequential model: the Keras functional API	p. 234
            Introduction to the functional API	p. 236
            Multi-input models	p. 238
            Multi-output models	p. 240
            Directed acyclic graphs of layers	p. 242
            Layer weight sharing	p. 246
            Models as layers	p. 247
            Wrapping up	p. 248
7.2	        Inspecting and monitoring deep-learning models using Keras callbacks and TensorBoard	p. 249
            Using callbacks to act on a model during training	p. 249
            Introduction to TensorBoard: the TensorFlow visualization framework	p. 252
            Wrapping up	p. 259
7.3	        Getting the most out of your models	p. 260
            Advanced architecture patterns	p. 260
            Hyperparameter optimization	p. 263
            Model ensembling	p. 264
            Wrapping up	p. 266
7.4	        Chapter summary	p. 268
8	    Generative deep learning	p. 269
8.1	        Text generation with LSTM	p. 271
            A brief history of generative recurrent networks	p. 271
            How do you generate sequence data?	p. 272
            The importance of the sampling strategy	p. 272
            Implementing character-level LSTM text generation	p. 274
            Wrapping up	p. 279
8.2	        DeepDream	p. 280
            Implementing DeepDream in Keras	p. 281
            Wrapping up	p. 286
8.3	        Neural style transfer	p. 287
            The content loss	p. 288
            The style loss	p. 288
            Neural style transfer in Keras	p. 289
            Wrapping up	p. 295
8.4	        Generating images with variational autoencoders	p. 296
            Sampling from latent spaces of images	p. 296
            Concept vectors for image editing	p. 297
            Variational autoencoders	p. 298
            Wrapping up	p. 304
8.5	        Introduction to generative adversarial networks	p. 305
            A schematic GAN implementation	p. 307
            A bag of tricks	p. 307
            The generator	p. 308
            The discriminator	p. 309
            The adversarial network	p. 310
            How to train your DCGAN	p. 310
            Wrapping up	p. 312
8.6	        Chapter summary	p. 313
9	    Conclusions	p. 314
9.1	        Key concepts in review	p. 315
            Various approaches to AI	p. 315
            What makes deep learning special within the field of machine learning	p. 315
            How to think about deep learning	p. 316
            Key enabling technologies	p. 317
            The universal machine-learning workflow	p. 318
            Key network architectures	p. 319
            The space of possibilities	p. 322
9.2	        The limitations of deep learning	p. 325
            The risk of anthropomorphizing machine-learning models	p. 325
            Local generalization vs. extreme generalization	p. 327
            Wrapping up	p. 329
9.3	        The future of deep learning	p. 330
            Models as programs	p. 330
            Beyond backpropagation and differentiable layers	p. 332
            Automated machine learning	p. 332
            Lifelong learning and modular subroutine reuse	p. 333
            The long-term vision	p. 335
9.4	        Staying up to date in a fast-moving field	p. 337
            Practice on real-world problems using Kaggle	p. 337
            Read about the latest developments on arXiv	p. 337
            Explore the Keras ecosystem	p. 338
9.5	        Final words	p. 339
Appendix A	Installing Keras and its dependencies on Ubuntu	p. 340
Appendix B	Running Jupyter notebooks on an EC2 GPU instance	p. 345
Index	p. 353

New Arrivals Books in Related Fields

National Academies of Sciences, Engineering, and Medicine (U.S.) (2020)
Cartwright, Hugh M. (2021)
한국소프트웨어기술인협회. 빅데이터전략연구소 (2021)