Deep learning with Keras : implement neural networks with Keras on Theano and TensorFlow
Main author:
Other authors:
Format: E-Book
Language: English
Published: Birmingham : Packt Publishing, ©2017
Subjects:
Other locations: See this record in the Sudoc
Online access: Access to the e-book
Table of contents:
- Cover; Copyright; Credits; About the Authors; About the Reviewer; www.PacktPub.com; Customer Feedback; Table of Contents; Preface
- Chapter 1: Neural Networks Foundations; Perceptron; The first example of Keras code; Multilayer perceptron - the first example of a network; Problems in training the perceptron and a solution; Activation function - sigmoid; Activation function - ReLU; Activation functions; A real example - recognizing handwritten digits; One-hot encoding - OHE; Defining a simple neural net in Keras; Running a simple Keras net and establishing a baseline; Improving the simple net in Keras with hidden layers; Further improving the simple net in Keras with dropout; Testing different optimizers in Keras; Increasing the number of epochs; Controlling the optimizer learning rate; Increasing the number of internal hidden neurons; Increasing the size of batch computation; Summarizing the experiments run for recognizing handwritten digits; Adopting regularization for avoiding overfitting; Hyperparameters tuning; Predicting output; A practical overview of backpropagation; Towards a deep learning approach; Summary
- Chapter 2: Keras Installation and API; Installing Keras; Step 1 - install some useful dependencies; Step 2 - install Theano; Step 3 - install TensorFlow; Step 4 - install Keras; Step 5 - testing Theano, TensorFlow, and Keras; Configuring Keras; Installing Keras on Docker; Installing Keras on Google Cloud ML; Installing Keras on Amazon AWS; Installing Keras on Microsoft Azure; Keras API; Getting started with Keras architecture; What is a tensor?; Composing models in Keras; Sequential composition; Functional composition; An overview of predefined neural network layers; Regular dense; Recurrent neural networks - simple, LSTM, and GRU; Convolutional and pooling layers; Regularization; Batch normalization; An overview of predefined activation functions; An overview of loss functions; An overview of metrics; An overview of optimizers; Some useful operations; Saving and loading the weights and the architecture of a model; Callbacks for customizing the training process; Checkpointing; Using TensorBoard and Keras; Using Quiver and Keras; Summary
- Chapter 3: Deep Learning with ConvNets; Deep convolutional neural network - DCNN; Local receptive fields; Shared weights and bias; Pooling layers; Max-pooling; Average pooling; ConvNets summary; An example of DCNN - LeNet; LeNet code in Keras; Understanding the power of deep learning; Recognizing CIFAR-10 images with deep learning; Improving the CIFAR-10 performance with a deeper network; Improving the CIFAR-10 performance with data augmentation; Predicting with CIFAR-10; Very deep convolutional networks for large-scale image recognition; Recognizing cats with a VGG-16 net; Utilizing Keras built-in VGG-16 net module; Recycling pre-built deep learning models for extracting features; Very deep inception-v3 net used for transfer learning; Summary
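Chapter 1 culminates in defining and iteratively improving a simple dense network for handwritten digit recognition. A minimal sketch in that spirit (illustrative only, not the book's exact listing; it assumes Keras 2.x on a TensorFlow or Theano backend, with MNIST loaded from keras.datasets):

```python
# Illustrative sketch of a simple dense net for MNIST digits (assumes Keras 2.x).
from keras.datasets import mnist
from keras.models import Sequential
from keras.layers import Dense, Dropout
from keras.optimizers import SGD
from keras.utils import np_utils

NB_CLASSES = 10

# Load MNIST and flatten the 28x28 images into 784-dimensional float vectors.
(X_train, y_train), (X_test, y_test) = mnist.load_data()
X_train = X_train.reshape(60000, 784).astype('float32') / 255
X_test = X_test.reshape(10000, 784).astype('float32') / 255

# One-hot encode (OHE) the integer labels.
Y_train = np_utils.to_categorical(y_train, NB_CLASSES)
Y_test = np_utils.to_categorical(y_test, NB_CLASSES)

# Two hidden ReLU layers with dropout, softmax output over the 10 classes.
model = Sequential()
model.add(Dense(128, activation='relu', input_shape=(784,)))
model.add(Dropout(0.3))
model.add(Dense(128, activation='relu'))
model.add(Dropout(0.3))
model.add(Dense(NB_CLASSES, activation='softmax'))

model.compile(loss='categorical_crossentropy', optimizer=SGD(), metrics=['accuracy'])
model.fit(X_train, Y_train, batch_size=128, epochs=20, validation_split=0.2)
score = model.evaluate(X_test, Y_test, verbose=0)
print('Test accuracy:', score[1])
```

The chapter's experiments (more hidden neurons, different optimizers, more epochs, larger batches, regularization) are variations on a baseline of this shape.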
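Chapter 2 contrasts sequential and functional model composition and covers saving and loading a model's weights and architecture. A rough sketch of those ideas, again assuming Keras 2.x (with h5py available for the HDF5 weight file):

```python
# Sketch of the two composition styles plus model persistence (assumes Keras 2.x).
from keras.models import Sequential, Model, model_from_json
from keras.layers import Input, Dense

# Sequential composition: a linear stack of layers.
seq = Sequential([
    Dense(64, activation='relu', input_shape=(20,)),
    Dense(1, activation='sigmoid'),
])

# Functional composition: layers are callables wired into a graph of tensors.
inputs = Input(shape=(20,))
x = Dense(64, activation='relu')(inputs)
outputs = Dense(1, activation='sigmoid')(x)
func = Model(inputs=inputs, outputs=outputs)
func.compile(loss='binary_crossentropy', optimizer='adam', metrics=['accuracy'])

# Save the architecture as JSON and the weights as HDF5, then rebuild the model.
json_string = func.to_json()
func.save_weights('model_weights.h5')

restored = model_from_json(json_string)
restored.load_weights('model_weights.h5')
```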
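Chapter 3's deep convolutional networks start from a LeNet-style model. The following is a hypothetical LeNet-like sketch, not the book's code, assuming Keras 2.x and channels-last image ordering:

```python
# Rough LeNet-style DCNN sketch: convolution + pooling stages, then a dense classifier.
from keras.models import Sequential
from keras.layers import Conv2D, MaxPooling2D, Flatten, Dense

model = Sequential()
# Stage 1: local receptive fields with shared weights, followed by max-pooling.
model.add(Conv2D(20, kernel_size=(5, 5), activation='relu',
                 input_shape=(28, 28, 1), padding='same'))
model.add(MaxPooling2D(pool_size=(2, 2)))
# Stage 2: deeper feature maps, pooled again.
model.add(Conv2D(50, kernel_size=(5, 5), activation='relu', padding='same'))
model.add(MaxPooling2D(pool_size=(2, 2)))
# Dense classifier on top of the learned features.
model.add(Flatten())
model.add(Dense(500, activation='relu'))
model.add(Dense(10, activation='softmax'))

model.compile(loss='categorical_crossentropy', optimizer='adadelta',
              metrics=['accuracy'])
model.summary()
```

For the later VGG-16 sections, Keras ships a built-in pre-trained model (keras.applications.vgg16.VGG16, e.g. instantiated with weights='imagenet'), which is the kind of pre-built network the chapter recycles for feature extraction and transfer learning.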