Updated: 2021-07-02 14:11:25
Cover Page
Title Page
Copyright and Credits
Intelligent Projects Using Python
About Packt
Why subscribe?
Packt.com
Contributors
About the author
About the reviewer
Packt is searching for authors like you
Preface
Who this book is for
What this book covers
To get the most out of this book
Download the example code files
Download the color images
Code in action
Conventions used
Get in touch
Reviews
Foundations of Artificial Intelligence Based Systems
Neural networks
Neural activation units
Linear activation units
Sigmoid activation units
The hyperbolic tangent activation function
Rectified linear unit (ReLU)
The softmax activation unit
The backpropagation method of training neural networks
Convolutional neural networks
Recurrent neural networks (RNNs)
Long short-term memory (LSTM) cells
Generative adversarial networks
Reinforcement learning
Q-learning
Deep Q-learning
Transfer learning
Restricted Boltzmann machines
Autoencoders
Summary
Transfer Learning
Technical requirements
Introduction to transfer learning
Transfer learning and detecting diabetic retinopathy
The diabetic retinopathy dataset
Formulating the loss function
Taking class imbalances into account
Preprocessing the images
Additional data generation using affine transformation
Rotation
Translation
Scaling
Reflection
Additional image generation through affine transformation
Network architecture
The VGG16 transfer learning network
The InceptionV3 transfer learning network
The ResNet50 transfer learning network
The optimizer and initial learning rate
Cross-validation
Model checkpoints based on validation log loss
Python implementation of the training process
Dynamic mini batch creation during training
Results from the categorical classification
Inference at testing time
Performing regression instead of categorical classification
Using the Keras sequential utils as a generator
Neural Machine Translation
Rule-based machine translation
The analysis phase
Lexical transfer phase
Generation phase
Statistical machine-learning systems
Language model
Perplexity for language models
Translation model
Neural machine translation
The encoder–decoder model
Inference using the encoder–decoder model
Implementing a sequence-to-sequence neural translation machine
Processing the input data
Defining a model for neural machine translation
Loss function for the neural translation machine
Training the model
Building the inference model
Word vector embeddings
Embeddings layer
Implementing the embeddings-based NMT
Style Transfer in the Fashion Industry Using GANs
DiscoGAN
CycleGAN
Learning to generate natural handbags from sketched outlines
Preprocessing the images
The generators of the DiscoGAN
The discriminators of the DiscoGAN