Add files using upload-large-folder tool
- .gitattributes +271 -0
- 10. Data Augmentation/10.1 - 10.3 - Data Augmentation - Cats vs Dogs.ipynb +3 -802
- 10. Data Augmentation/10.4 - Data Augmentation Demos.ipynb +3 -420
- 11. Confusion Matrix and Viewing Misclassifications/11.1 - 11.2 - MNIST Confusion Matrix Analysis and Viewing Misclassifications.ipynb +3 -484
- 12. Optimizers, Adaptive Learning Rate & Callbacks/12.2 Checkpointing Models and Early Stopping.ipynb +3 -277
- 12. Optimizers, Adaptive Learning Rate & Callbacks/12.3 Building a Fruit Classifer.ipynb +0 -0
- 13. Building LeNet and AlexNet in Keras/13.1 Built LeNet and test on MNIST.ipynb +3 -209
- 13. Building LeNet and AlexNet in Keras/13.2 Build AlexNet and test on CIFAR10.ipynb +3 -266
- 13. Building LeNet and AlexNet in Keras/13.4 Fashion MNIST.ipynb +3 -445
- 14. ImageNet and Pretrained Models VGG16_ResNet50_InceptionV3/14.1 Experimenting with pre-trained Models in Keras.ipynb +3 -227
- 15. Transfer Learning & Fine Tuning/15.2 Using MobileNet to make a Monkey Breed Classifier.ipynb +3 -657
- 15. Transfer Learning & Fine Tuning/15.3 Making a Flower Classifier with VGG16.ipynb +3 -693
- 16. Design Your Own CNN - LittleVGG/16.2 LittleVGG - Simpsons.ipynb +0 -0
- 18 . Deep Survaliance - Build a Face Detector with Emotion, Age and Gender Recognition/18.2 Building an Emotion Detector with LittleVGG.ipynb +3 -723
- 18 . Deep Survaliance - Build a Face Detector with Emotion, Age and Gender Recognition/18.3A - Age, Gender Detection.ipynb +3 -174
- 18 . Deep Survaliance - Build a Face Detector with Emotion, Age and Gender Recognition/18.3B Age, Gender with Emotion.ipynb +3 -526
- 18 . Deep Survaliance - Build a Face Detector with Emotion, Age and Gender Recognition/Face Detection - Friends Characters.ipynb +3 -526
- 18 . Deep Survaliance - Build a Face Detector with Emotion, Age and Gender Recognition/Face Extraction from Video.ipynb +3 -93
- 19. Medical Imaging Segmentation using U-Net/U-Net (not compatible with TensorFlow 2.0, required to downgrade).ipynb +0 -0
- 21. TensforFlow Object Detection/object_detection_tutorial.ipynb +0 -0
- 23. DeepDream & Neural Style Transfers/.~24.1 DeepDream.ipynb +3 -309
- 23. DeepDream & Neural Style Transfers/23.1 DeepDream.ipynb +0 -0
- 23. DeepDream & Neural Style Transfers/24. Neural Style Transfer.ipynb +0 -0
- 24. GANS_Generative_Networks/MNIST DCGAN.ipynb +3 -918
- 25. Face Recognition/.ipynb_checkpoints/25.0 Face Extraction from Video - Build Dataset-checkpoint.ipynb +3 -93
- 25. Face Recognition/.ipynb_checkpoints/25.1 Face Recognition - Friends Characters - Train and Test-checkpoint.ipynb +3 -536
- 25. Face Recognition/.ipynb_checkpoints/25.2 Face Recogition - Matching Faces-checkpoint.ipynb +0 -0
- 25. Face Recognition/.ipynb_checkpoints/25.3 Face Recogition - One Shot Learning-checkpoint.ipynb +3 -406
- 25. Face Recognition/.ipynb_checkpoints/Face Recogition - Matching Faces-checkpoint.ipynb +3 -6
- 25. Face Recognition/.ipynb_checkpoints/Face Recogition - One Shot Learning-checkpoint.ipynb +0 -0
- 25. Face Recognition/25.0 Face Extraction from Video - Build Dataset.ipynb +3 -93
- 25. Face Recognition/25.1 Face Recognition - Friends Characters - Train and Test.ipynb +3 -521
- 25. Face Recognition/25.2 Face Recogition - Matching Faces.ipynb +0 -0
- 25. Face Recognition/25.3 Face Recogition - One Shot Learning.ipynb +3 -406
- 26. Credit Card/26. Credit Card Reader.ipynb +3 -1066
- 4. Get Started! Handwritting Recognition, Simple Object Classification & OpenCV Demo/.ipynb_checkpoints/4.1 - Handwritten Digit Classification Demo (MNIST)-checkpoint.ipynb +3 -327
- 4. Get Started! Handwritting Recognition, Simple Object Classification & OpenCV Demo/.ipynb_checkpoints/4.2 - Image Classifier - CIFAR10-checkpoint.ipynb +3 -6
- 4. Get Started! Handwritting Recognition, Simple Object Classification & OpenCV Demo/.ipynb_checkpoints/4.3. Live Sketching-checkpoint.ipynb +3 -101
- 4. Get Started! Handwritting Recognition, Simple Object Classification & OpenCV Demo/.ipynb_checkpoints/Test - Imports Keras, OpenCV and tests webcam-checkpoint.ipynb +3 -73
- 4. Get Started! Handwritting Recognition, Simple Object Classification & OpenCV Demo/4.1 - Handwritten Digit Classification Demo (MNIST).ipynb +3 -656
- 4. Get Started! Handwritting Recognition, Simple Object Classification & OpenCV Demo/4.2 - Image Classifier - CIFAR10.ipynb +3 -167
- 4. Get Started! Handwritting Recognition, Simple Object Classification & OpenCV Demo/4.3. Live Sketching.ipynb +3 -126
- 4. Get Started! Handwritting Recognition, Simple Object Classification & OpenCV Demo/Test - Imports Keras, OpenCV and tests webcam.ipynb +3 -122
- 8. Making a CNN in Keras/8.11 - Building a CNN for Image Classification - CIFAR10.ipynb +3 -379
- 8. Making a CNN in Keras/8.3 to 8.10 - Building a CNN for handwritten digits - MNIST.ipynb +0 -0
- 9. Visualizing What CNNs 'see' & Filter Visualization/9.1 Activation Maximization using Keras Visualization Toolkit.ipynb +0 -0
- 9. Visualizing What CNNs 'see' & Filter Visualization/9.2 Saliency Maps.ipynb +0 -0
- 9. Visualizing What CNNs 'see' & Filter Visualization/9.3A Visualizing Filter Patterns.ipynb +0 -0
- 9. Visualizing What CNNs 'see' & Filter Visualization/9.3B Visualizing Filter Patterns - VGG16.ipynb +0 -0
- 9. Visualizing What CNNs 'see' & Filter Visualization/9.4 Heat Map Visualizations of Class Activation.ipynb +0 -0
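Note the recurring +3 on the notebooks above: each file's contents were replaced by a Git LFS pointer, which is always exactly three lines. A pointer file looks like this (the digest and size below are placeholders, not values from this commit):

    version https://git-lfs.github.com/spec/v1
    oid sha256:<64-hex-character-digest>
    size <file-size-in-bytes>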
.gitattributes
CHANGED
@@ -1,3 +1,274 @@
 *.ipynb filter=lfs diff=lfs merge=lfs -text
 *.h5 filter=lfs diff=lfs merge=lfs -text
 *.npy filter=lfs diff=lfs merge=lfs -text
| 1 |
*.ipynb filter=lfs diff=lfs merge=lfs -text
|
| 2 |
*.h5 filter=lfs diff=lfs merge=lfs -text
|
| 3 |
*.npy filter=lfs diff=lfs merge=lfs -text
|
| 4 |
+
14.[[:space:]]ImageNet[[:space:]]and[[:space:]]Pretrained[[:space:]]Models[[:space:]]VGG16_ResNet50_InceptionV3/images/dog.jpg filter=lfs diff=lfs merge=lfs -text
|
| 5 |
+
14.[[:space:]]ImageNet[[:space:]]and[[:space:]]Pretrained[[:space:]]Models[[:space:]]VGG16_ResNet50_InceptionV3/images/snail.jpg filter=lfs diff=lfs merge=lfs -text
|
| 6 |
+
23.[[:space:]]DeepDream[[:space:]]&[[:space:]]Neural[[:space:]]Style[[:space:]]Transfers/style_transfer_results/starrynight_onto_eiffel_at_iteration_3.png filter=lfs diff=lfs merge=lfs -text
|
| 7 |
+
23.[[:space:]]DeepDream[[:space:]]&[[:space:]]Neural[[:space:]]Style[[:space:]]Transfers/style_transfer_results/starrynight_onto_eiffel_at_iteration_5.png filter=lfs diff=lfs merge=lfs -text
|
| 8 |
+
23.[[:space:]]DeepDream[[:space:]]&[[:space:]]Neural[[:space:]]Style[[:space:]]Transfers/style_transfer_results/starrynight_onto_eiffel_at_iteration_6.png filter=lfs diff=lfs merge=lfs -text
|
| 9 |
+
23.[[:space:]]DeepDream[[:space:]]&[[:space:]]Neural[[:space:]]Style[[:space:]]Transfers/style_transfer_results/starrynight_onto_eiffel_at_iteration_8.png filter=lfs diff=lfs merge=lfs -text
|
| 10 |
+
23.[[:space:]]DeepDream[[:space:]]&[[:space:]]Neural[[:space:]]Style[[:space:]]Transfers/style_transfer_results/style_transfer_result_at_iteration_0.png filter=lfs diff=lfs merge=lfs -text
|
| 11 |
+
23.[[:space:]]DeepDream[[:space:]]&[[:space:]]Neural[[:space:]]Style[[:space:]]Transfers/style_transfer_results/starrynight_onto_eiffel_at_iteration_9.png filter=lfs diff=lfs merge=lfs -text
|
| 12 |
+
23.[[:space:]]DeepDream[[:space:]]&[[:space:]]Neural[[:space:]]Style[[:space:]]Transfers/style_transfer_results/starrynight_onto_eiffel_at_iteration_7.png filter=lfs diff=lfs merge=lfs -text
|
| 13 |
+
23.[[:space:]]DeepDream[[:space:]]&[[:space:]]Neural[[:space:]]Style[[:space:]]Transfers/style_transfer_results/style_transfer_result_at_iteration_10.png filter=lfs diff=lfs merge=lfs -text
|
| 14 |
+
23.[[:space:]]DeepDream[[:space:]]&[[:space:]]Neural[[:space:]]Style[[:space:]]Transfers/style_transfer_results/style_transfer_result_at_iteration_11.png filter=lfs diff=lfs merge=lfs -text
|
| 15 |
+
23.[[:space:]]DeepDream[[:space:]]&[[:space:]]Neural[[:space:]]Style[[:space:]]Transfers/style_transfer_results/style_transfer_result_at_iteration_12.png filter=lfs diff=lfs merge=lfs -text
|
| 16 |
+
23.[[:space:]]DeepDream[[:space:]]&[[:space:]]Neural[[:space:]]Style[[:space:]]Transfers/style_transfer_results/style_transfer_result_at_iteration_1.png filter=lfs diff=lfs merge=lfs -text
|
| 17 |
+
23.[[:space:]]DeepDream[[:space:]]&[[:space:]]Neural[[:space:]]Style[[:space:]]Transfers/style_transfer_results/style_transfer_result_at_iteration_16.png filter=lfs diff=lfs merge=lfs -text
|
| 18 |
+
23.[[:space:]]DeepDream[[:space:]]&[[:space:]]Neural[[:space:]]Style[[:space:]]Transfers/style_transfer_results/style_transfer_result_at_iteration_14.png filter=lfs diff=lfs merge=lfs -text
|
| 19 |
+
23.[[:space:]]DeepDream[[:space:]]&[[:space:]]Neural[[:space:]]Style[[:space:]]Transfers/style_transfer_results/style_transfer_result_at_iteration_15.png filter=lfs diff=lfs merge=lfs -text
|
| 20 |
+
23.[[:space:]]DeepDream[[:space:]]&[[:space:]]Neural[[:space:]]Style[[:space:]]Transfers/style_transfer_results/style_transfer_result_at_iteration_13.png filter=lfs diff=lfs merge=lfs -text
|
| 21 |
+
23.[[:space:]]DeepDream[[:space:]]&[[:space:]]Neural[[:space:]]Style[[:space:]]Transfers/style_transfer_results/style_transfer_result_at_iteration_17.png filter=lfs diff=lfs merge=lfs -text
|
| 22 |
+
23.[[:space:]]DeepDream[[:space:]]&[[:space:]]Neural[[:space:]]Style[[:space:]]Transfers/style_transfer_results/style_transfer_result_at_iteration_18.png filter=lfs diff=lfs merge=lfs -text
|
| 23 |
+
23.[[:space:]]DeepDream[[:space:]]&[[:space:]]Neural[[:space:]]Style[[:space:]]Transfers/style_transfer_results/style_transfer_result_at_iteration_19.png filter=lfs diff=lfs merge=lfs -text
|
| 24 |
+
23.[[:space:]]DeepDream[[:space:]]&[[:space:]]Neural[[:space:]]Style[[:space:]]Transfers/style_transfer_results/style_transfer_result_at_iteration_2.png filter=lfs diff=lfs merge=lfs -text
|
| 25 |
+
23.[[:space:]]DeepDream[[:space:]]&[[:space:]]Neural[[:space:]]Style[[:space:]]Transfers/style_transfer_results/style_transfer_result_at_iteration_5.png filter=lfs diff=lfs merge=lfs -text
|
| 26 |
+
23.[[:space:]]DeepDream[[:space:]]&[[:space:]]Neural[[:space:]]Style[[:space:]]Transfers/style_transfer_results/style_transfer_result_at_iteration_6.png filter=lfs diff=lfs merge=lfs -text
|
| 27 |
+
23.[[:space:]]DeepDream[[:space:]]&[[:space:]]Neural[[:space:]]Style[[:space:]]Transfers/style_transfer_results/style_transfer_result_at_iteration_3.png filter=lfs diff=lfs merge=lfs -text
|
| 28 |
+
23.[[:space:]]DeepDream[[:space:]]&[[:space:]]Neural[[:space:]]Style[[:space:]]Transfers/style_transfer_results/style_transfer_result_at_iteration_4.png filter=lfs diff=lfs merge=lfs -text
|
| 29 |
+
23.[[:space:]]DeepDream[[:space:]]&[[:space:]]Neural[[:space:]]Style[[:space:]]Transfers/style_transfer_results/style_transfer_result_at_iteration_7.png filter=lfs diff=lfs merge=lfs -text
|
| 30 |
+
23.[[:space:]]DeepDream[[:space:]]&[[:space:]]Neural[[:space:]]Style[[:space:]]Transfers/style_transfer_results/style_transfer_result_at_iteration_9.png filter=lfs diff=lfs merge=lfs -text
|
| 31 |
+
23.[[:space:]]DeepDream[[:space:]]&[[:space:]]Neural[[:space:]]Style[[:space:]]Transfers/style_transfer_results/style_transfer_result_at_iteration_8.png filter=lfs diff=lfs merge=lfs -text
|
| 32 |
+
23.[[:space:]]DeepDream[[:space:]]&[[:space:]]Neural[[:space:]]Style[[:space:]]Transfers/style_transfer_results/style_transfer_result_at_iteration_nid_0.png filter=lfs diff=lfs merge=lfs -text
|
| 33 |
+
23.[[:space:]]DeepDream[[:space:]]&[[:space:]]Neural[[:space:]]Style[[:space:]]Transfers/style_transfer_results/style_transfer_result_at_iteration_nid_11.png filter=lfs diff=lfs merge=lfs -text
|
| 34 |
+
23.[[:space:]]DeepDream[[:space:]]&[[:space:]]Neural[[:space:]]Style[[:space:]]Transfers/style_transfer_results/style_transfer_result_at_iteration_nid_12.png filter=lfs diff=lfs merge=lfs -text
|
| 35 |
+
23.[[:space:]]DeepDream[[:space:]]&[[:space:]]Neural[[:space:]]Style[[:space:]]Transfers/style_transfer_results/style_transfer_result_at_iteration_nid_10.png filter=lfs diff=lfs merge=lfs -text
|
| 36 |
+
23.[[:space:]]DeepDream[[:space:]]&[[:space:]]Neural[[:space:]]Style[[:space:]]Transfers/style_transfer_results/style_transfer_result_at_iteration_nid_1.png filter=lfs diff=lfs merge=lfs -text
|
| 37 |
+
23.[[:space:]]DeepDream[[:space:]]&[[:space:]]Neural[[:space:]]Style[[:space:]]Transfers/style_transfer_results/style_transfer_result_at_iteration_nid_15.png filter=lfs diff=lfs merge=lfs -text
|
| 38 |
+
23.[[:space:]]DeepDream[[:space:]]&[[:space:]]Neural[[:space:]]Style[[:space:]]Transfers/style_transfer_results/style_transfer_result_at_iteration_nid_14.png filter=lfs diff=lfs merge=lfs -text
|
| 39 |
+
23.[[:space:]]DeepDream[[:space:]]&[[:space:]]Neural[[:space:]]Style[[:space:]]Transfers/style_transfer_results/style_transfer_result_at_iteration_nid_13.png filter=lfs diff=lfs merge=lfs -text
|
| 40 |
+
23.[[:space:]]DeepDream[[:space:]]&[[:space:]]Neural[[:space:]]Style[[:space:]]Transfers/style_transfer_results/style_transfer_result_at_iteration_nid_16.png filter=lfs diff=lfs merge=lfs -text
|
| 41 |
+
23.[[:space:]]DeepDream[[:space:]]&[[:space:]]Neural[[:space:]]Style[[:space:]]Transfers/style_transfer_results/style_transfer_result_at_iteration_nid_2.png filter=lfs diff=lfs merge=lfs -text
|
| 42 |
+
23.[[:space:]]DeepDream[[:space:]]&[[:space:]]Neural[[:space:]]Style[[:space:]]Transfers/style_transfer_results/style_transfer_result_at_iteration_nid_19.png filter=lfs diff=lfs merge=lfs -text
|
| 43 |
+
23.[[:space:]]DeepDream[[:space:]]&[[:space:]]Neural[[:space:]]Style[[:space:]]Transfers/style_transfer_results/style_transfer_result_at_iteration_nid_17.png filter=lfs diff=lfs merge=lfs -text
|
| 44 |
+
23.[[:space:]]DeepDream[[:space:]]&[[:space:]]Neural[[:space:]]Style[[:space:]]Transfers/style_transfer_results/style_transfer_result_at_iteration_nid_18.png filter=lfs diff=lfs merge=lfs -text
|
| 45 |
+
23.[[:space:]]DeepDream[[:space:]]&[[:space:]]Neural[[:space:]]Style[[:space:]]Transfers/style_transfer_results/style_transfer_result_at_iteration_nid_3.png filter=lfs diff=lfs merge=lfs -text
|
| 46 |
+
23.[[:space:]]DeepDream[[:space:]]&[[:space:]]Neural[[:space:]]Style[[:space:]]Transfers/style_transfer_results/style_transfer_result_at_iteration_nid_5.png filter=lfs diff=lfs merge=lfs -text
|
| 47 |
+
23.[[:space:]]DeepDream[[:space:]]&[[:space:]]Neural[[:space:]]Style[[:space:]]Transfers/style_transfer_results/style_transfer_result_at_iteration_nid_4.png filter=lfs diff=lfs merge=lfs -text
|
| 48 |
+
23.[[:space:]]DeepDream[[:space:]]&[[:space:]]Neural[[:space:]]Style[[:space:]]Transfers/style_transfer_results/style_transfer_result_at_iteration_nid_6.png filter=lfs diff=lfs merge=lfs -text
|
| 49 |
+
23.[[:space:]]DeepDream[[:space:]]&[[:space:]]Neural[[:space:]]Style[[:space:]]Transfers/style_transfer_results/style_transfer_result_at_iteration_nid_8.png filter=lfs diff=lfs merge=lfs -text
|
| 50 |
+
23.[[:space:]]DeepDream[[:space:]]&[[:space:]]Neural[[:space:]]Style[[:space:]]Transfers/style_transfer_results/style_transfer_result_at_iteration_nid_7.png filter=lfs diff=lfs merge=lfs -text
|
| 51 |
+
23.[[:space:]]DeepDream[[:space:]]&[[:space:]]Neural[[:space:]]Style[[:space:]]Transfers/style_transfer_results/style_transfer_result_at_iteration_nid_9.png filter=lfs diff=lfs merge=lfs -text
|
| 52 |
+
23.[[:space:]]DeepDream[[:space:]]&[[:space:]]Neural[[:space:]]Style[[:space:]]Transfers/style_transfer_results/style_transfer_result_at_iteration_np_0.png filter=lfs diff=lfs merge=lfs -text
|
| 53 |
+
23.[[:space:]]DeepDream[[:space:]]&[[:space:]]Neural[[:space:]]Style[[:space:]]Transfers/style_transfer_results/style_transfer_result_at_iteration_np_10.png filter=lfs diff=lfs merge=lfs -text
|
| 54 |
+
23.[[:space:]]DeepDream[[:space:]]&[[:space:]]Neural[[:space:]]Style[[:space:]]Transfers/style_transfer_results/style_transfer_result_at_iteration_np_1.png filter=lfs diff=lfs merge=lfs -text
|
| 55 |
+
23.[[:space:]]DeepDream[[:space:]]&[[:space:]]Neural[[:space:]]Style[[:space:]]Transfers/style_transfer_results/style_transfer_result_at_iteration_np_11.png filter=lfs diff=lfs merge=lfs -text
|
| 56 |
+
23.[[:space:]]DeepDream[[:space:]]&[[:space:]]Neural[[:space:]]Style[[:space:]]Transfers/style_transfer_results/style_transfer_result_at_iteration_np_14.png filter=lfs diff=lfs merge=lfs -text
|
| 57 |
+
23.[[:space:]]DeepDream[[:space:]]&[[:space:]]Neural[[:space:]]Style[[:space:]]Transfers/style_transfer_results/style_transfer_result_at_iteration_np_12.png filter=lfs diff=lfs merge=lfs -text
|
| 58 |
+
23.[[:space:]]DeepDream[[:space:]]&[[:space:]]Neural[[:space:]]Style[[:space:]]Transfers/style_transfer_results/style_transfer_result_at_iteration_np_13.png filter=lfs diff=lfs merge=lfs -text
|
| 59 |
+
23.[[:space:]]DeepDream[[:space:]]&[[:space:]]Neural[[:space:]]Style[[:space:]]Transfers/style_transfer_results/style_transfer_result_at_iteration_np_18.png filter=lfs diff=lfs merge=lfs -text
|
| 60 |
+
23.[[:space:]]DeepDream[[:space:]]&[[:space:]]Neural[[:space:]]Style[[:space:]]Transfers/style_transfer_results/style_transfer_result_at_iteration_np_17.png filter=lfs diff=lfs merge=lfs -text
|
| 61 |
+
23.[[:space:]]DeepDream[[:space:]]&[[:space:]]Neural[[:space:]]Style[[:space:]]Transfers/style_transfer_results/style_transfer_result_at_iteration_np_15.png filter=lfs diff=lfs merge=lfs -text
|
| 62 |
+
23.[[:space:]]DeepDream[[:space:]]&[[:space:]]Neural[[:space:]]Style[[:space:]]Transfers/style_transfer_results/style_transfer_result_at_iteration_np_16.png filter=lfs diff=lfs merge=lfs -text
|
| 63 |
+
23.[[:space:]]DeepDream[[:space:]]&[[:space:]]Neural[[:space:]]Style[[:space:]]Transfers/style_transfer_results/style_transfer_result_at_iteration_np_19.png filter=lfs diff=lfs merge=lfs -text
|
| 64 |
+
23.[[:space:]]DeepDream[[:space:]]&[[:space:]]Neural[[:space:]]Style[[:space:]]Transfers/style_transfer_results/style_transfer_result_at_iteration_np_2.png filter=lfs diff=lfs merge=lfs -text
|
| 65 |
+
23.[[:space:]]DeepDream[[:space:]]&[[:space:]]Neural[[:space:]]Style[[:space:]]Transfers/style_transfer_results/style_transfer_result_at_iteration_np_4.png filter=lfs diff=lfs merge=lfs -text
|
| 66 |
+
23.[[:space:]]DeepDream[[:space:]]&[[:space:]]Neural[[:space:]]Style[[:space:]]Transfers/style_transfer_results/style_transfer_result_at_iteration_np_3.png filter=lfs diff=lfs merge=lfs -text
|
| 67 |
+
23.[[:space:]]DeepDream[[:space:]]&[[:space:]]Neural[[:space:]]Style[[:space:]]Transfers/style_transfer_results/style_transfer_result_at_iteration_np_5.png filter=lfs diff=lfs merge=lfs -text
|
| 68 |
+
23.[[:space:]]DeepDream[[:space:]]&[[:space:]]Neural[[:space:]]Style[[:space:]]Transfers/style_transfer_results/style_transfer_result_at_iteration_np_7.png filter=lfs diff=lfs merge=lfs -text
|
| 69 |
+
23.[[:space:]]DeepDream[[:space:]]&[[:space:]]Neural[[:space:]]Style[[:space:]]Transfers/style_transfer_results/style_transfer_result_at_iteration_np_8.png filter=lfs diff=lfs merge=lfs -text
|
| 70 |
+
23.[[:space:]]DeepDream[[:space:]]&[[:space:]]Neural[[:space:]]Style[[:space:]]Transfers/style_transfer_results/style_transfer_result_at_iteration_np_6.png filter=lfs diff=lfs merge=lfs -text
|
| 71 |
+
23.[[:space:]]DeepDream[[:space:]]&[[:space:]]Neural[[:space:]]Style[[:space:]]Transfers/style_transfer_results/style_transfer_result_at_iteration_r_0.png filter=lfs diff=lfs merge=lfs -text
|
| 72 |
+
23.[[:space:]]DeepDream[[:space:]]&[[:space:]]Neural[[:space:]]Style[[:space:]]Transfers/style_transfer_results/style_transfer_result_at_iteration_np_9.png filter=lfs diff=lfs merge=lfs -text
|
| 73 |
+
23.[[:space:]]DeepDream[[:space:]]&[[:space:]]Neural[[:space:]]Style[[:space:]]Transfers/style_transfer_results/style_transfer_result_at_iteration_r_2.png filter=lfs diff=lfs merge=lfs -text
|
| 74 |
+
23.[[:space:]]DeepDream[[:space:]]&[[:space:]]Neural[[:space:]]Style[[:space:]]Transfers/style_transfer_results/style_transfer_result_at_iteration_r_1.png filter=lfs diff=lfs merge=lfs -text
|
| 75 |
+
23.[[:space:]]DeepDream[[:space:]]&[[:space:]]Neural[[:space:]]Style[[:space:]]Transfers/style_transfer_results/style_transfer_result_at_iteration_r_5.png filter=lfs diff=lfs merge=lfs -text
|
| 76 |
+
23.[[:space:]]DeepDream[[:space:]]&[[:space:]]Neural[[:space:]]Style[[:space:]]Transfers/style_transfer_results/style_transfer_result_at_iteration_r_4.png filter=lfs diff=lfs merge=lfs -text
|
| 77 |
+
23.[[:space:]]DeepDream[[:space:]]&[[:space:]]Neural[[:space:]]Style[[:space:]]Transfers/style_transfer_results/style_transfer_result_at_iteration_r_3.png filter=lfs diff=lfs merge=lfs -text
|
| 78 |
+
23.[[:space:]]DeepDream[[:space:]]&[[:space:]]Neural[[:space:]]Style[[:space:]]Transfers/style_transfer_results/style_transfer_result_at_iteration_r_6.png filter=lfs diff=lfs merge=lfs -text
|
| 79 |
+
23.[[:space:]]DeepDream[[:space:]]&[[:space:]]Neural[[:space:]]Style[[:space:]]Transfers/style_transfer_results/style_transfer_result_at_iteration_r_7.png filter=lfs diff=lfs merge=lfs -text
|
| 80 |
+
18[[:space:]].[[:space:]]Deep[[:space:]]Survaliance[[:space:]]-[[:space:]]Build[[:space:]]a[[:space:]]Face[[:space:]]Detector[[:space:]]with[[:space:]]Emotion,[[:space:]]Age[[:space:]]and[[:space:]]Gender[[:space:]]Recognition/images/obama.jpg filter=lfs diff=lfs merge=lfs -text
|
| 81 |
+
Parameterized[[:space:]]Learning.png filter=lfs diff=lfs merge=lfs -text
|
| 82 |
+
Screenshot[[:space:]]from[[:space:]]2024-03-10[[:space:]]20-42-24.png filter=lfs diff=lfs merge=lfs -text
|
| 83 |
+
output.mp4 filter=lfs diff=lfs merge=lfs -text
|
| 84 |
+
Screenshot[[:space:]]from[[:space:]]2024-03-10[[:space:]]20-59-52.png filter=lfs diff=lfs merge=lfs -text
|
| 85 |
+
Screenshot[[:space:]]from[[:space:]]2024-03-10[[:space:]]21-00-09.png filter=lfs diff=lfs merge=lfs -text
|
| 86 |
+
Screenshot[[:space:]]from[[:space:]]2024-03-10[[:space:]]21-00-51.png filter=lfs diff=lfs merge=lfs -text
|
| 87 |
+
Screenshot[[:space:]]from[[:space:]]2024-03-10[[:space:]]20-58-26.png filter=lfs diff=lfs merge=lfs -text
|
| 88 |
+
Screenshot[[:space:]]from[[:space:]]2024-08-10[[:space:]]21-00-04.png filter=lfs diff=lfs merge=lfs -text
|
| 89 |
+
Screenshot[[:space:]]from[[:space:]]2024-08-10[[:space:]]20-59-46.png filter=lfs diff=lfs merge=lfs -text
|
| 90 |
+
Screenshot[[:space:]]from[[:space:]]2024-08-10[[:space:]]21-00-56.png filter=lfs diff=lfs merge=lfs -text
|
| 91 |
+
Screenshot[[:space:]]from[[:space:]]2024-08-10[[:space:]]21-01-05.png filter=lfs diff=lfs merge=lfs -text
|
| 92 |
+
Why[[:space:]]AI[[:space:]]is[[:space:]]Harder[[:space:]]Than[[:space:]]We[[:space:]]Think.pdf filter=lfs diff=lfs merge=lfs -text
|
| 93 |
+
What[[:space:]]is[[:space:]]Multimodality.pdf filter=lfs diff=lfs merge=lfs -text
|
| 94 |
+
10.[[:space:]]Data[[:space:]]Augmentation[[:space:]]Cats[[:space:]]vs[[:space:]]Dogs/1.[[:space:]]Data[[:space:]]Augmentation[[:space:]]Chapter[[:space:]]Overview.mp4 filter=lfs diff=lfs merge=lfs -text
|
| 95 |
+
11.[[:space:]]Assessing[[:space:]]Model[[:space:]]Performance/1.[[:space:]]Introduction[[:space:]]to[[:space:]]the[[:space:]]Confusion[[:space:]]Matrix[[:space:]]&[[:space:]]Viewing[[:space:]]Misclassifications.mp4 filter=lfs diff=lfs merge=lfs -text
|
| 96 |
+
10.[[:space:]]Data[[:space:]]Augmentation[[:space:]]Cats[[:space:]]vs[[:space:]]Dogs/4.[[:space:]]Boosting[[:space:]]Accuracy[[:space:]]with[[:space:]]Data[[:space:]]Augmentation.mp4 filter=lfs diff=lfs merge=lfs -text
|
| 97 |
+
10.[[:space:]]Data[[:space:]]Augmentation[[:space:]]Cats[[:space:]]vs[[:space:]]Dogs/3.[[:space:]]Train[[:space:]]a[[:space:]]Cats[[:space:]]vs.[[:space:]]Dogs[[:space:]]Classifier.mp4 filter=lfs diff=lfs merge=lfs -text
|
| 98 |
+
10.[[:space:]]Data[[:space:]]Augmentation[[:space:]]Cats[[:space:]]vs[[:space:]]Dogs/5.[[:space:]]Types[[:space:]]of[[:space:]]Data[[:space:]]Augmentation.mp4 filter=lfs diff=lfs merge=lfs -text
|
| 99 |
+
11.[[:space:]]Confusion[[:space:]]Matrix[[:space:]]and[[:space:]]Viewing[[:space:]]Misclassifications/MNIST_history.pickle filter=lfs diff=lfs merge=lfs -text
|
| 100 |
+
10.[[:space:]]Data[[:space:]]Augmentation[[:space:]]Cats[[:space:]]vs[[:space:]]Dogs/2.[[:space:]]Splitting[[:space:]]Data[[:space:]]into[[:space:]]Test[[:space:]]and[[:space:]]Training[[:space:]]Datasets.mp4 filter=lfs diff=lfs merge=lfs -text
|
| 101 |
+
12.[[:space:]]Optimizers,[[:space:]]Learning[[:space:]]Rates[[:space:]]&[[:space:]]Callbacks[[:space:]]with[[:space:]]Fruit[[:space:]]Classification/1.[[:space:]]Introduction[[:space:]]to[[:space:]]the[[:space:]]types[[:space:]]of[[:space:]]Optimizers,[[:space:]]Learning[[:space:]]Rates[[:space:]]&[[:space:]]Callbacks.mp4 filter=lfs diff=lfs merge=lfs -text
|
| 102 |
+
11.[[:space:]]Assessing[[:space:]]Model[[:space:]]Performance/3.[[:space:]]Finding[[:space:]]and[[:space:]]Viewing[[:space:]]Misclassified[[:space:]]Data.mp4 filter=lfs diff=lfs merge=lfs -text
|
| 103 |
+
11.[[:space:]]Assessing[[:space:]]Model[[:space:]]Performance/2.[[:space:]]Understanding[[:space:]]the[[:space:]]Confusion[[:space:]]Matrix.mp4 filter=lfs diff=lfs merge=lfs -text
|
| 104 |
+
13.[[:space:]]Batch[[:space:]]Normalization[[:space:]]&[[:space:]]LeNet,[[:space:]]AlexNet[[:space:]]Clothing[[:space:]]Classifier/1.[[:space:]]Intro[[:space:]]to[[:space:]]Building[[:space:]]LeNet,[[:space:]]AlexNet[[:space:]]in[[:space:]]Keras[[:space:]]&[[:space:]]Understand[[:space:]]Batch[[:space:]]Normalization.mp4 filter=lfs diff=lfs merge=lfs -text
|
| 105 |
+
12.[[:space:]]Optimizers,[[:space:]]Learning[[:space:]]Rates[[:space:]]&[[:space:]]Callbacks[[:space:]]with[[:space:]]Fruit[[:space:]]Classification/3.[[:space:]]Keras[[:space:]]Callbacks[[:space:]]and[[:space:]]Checkpoint,[[:space:]]Early[[:space:]]Stopping[[:space:]]and[[:space:]]Adjust[[:space:]]Learning[[:space:]]Rates[[:space:]]that[[:space:]]Pl.mp4 filter=lfs diff=lfs merge=lfs -text
|
| 106 |
+
12.[[:space:]]Optimizers,[[:space:]]Learning[[:space:]]Rates[[:space:]]&[[:space:]]Callbacks[[:space:]]with[[:space:]]Fruit[[:space:]]Classification/2.[[:space:]]Types[[:space:]]Optimizers[[:space:]]and[[:space:]]Adaptive[[:space:]]Learning[[:space:]]Rate[[:space:]]Methods.mp4 filter=lfs diff=lfs merge=lfs -text
|
| 107 |
+
13.[[:space:]]Batch[[:space:]]Normalization[[:space:]]&[[:space:]]LeNet,[[:space:]]AlexNet[[:space:]]Clothing[[:space:]]Classifier/2.[[:space:]]Build[[:space:]]LeNet[[:space:]]and[[:space:]]test[[:space:]]on[[:space:]]MNIST.mp4 filter=lfs diff=lfs merge=lfs -text
|
| 108 |
+
13.[[:space:]]Batch[[:space:]]Normalization[[:space:]]&[[:space:]]LeNet,[[:space:]]AlexNet[[:space:]]Clothing[[:space:]]Classifier/4.[[:space:]]Batch[[:space:]]Normalization.mp4 filter=lfs diff=lfs merge=lfs -text
|
| 109 |
+
12.[[:space:]]Optimizers,[[:space:]]Learning[[:space:]]Rates[[:space:]]&[[:space:]]Callbacks[[:space:]]with[[:space:]]Fruit[[:space:]]Classification/4.[[:space:]]Build[[:space:]]a[[:space:]]Fruit[[:space:]]Classifier.mp4 filter=lfs diff=lfs merge=lfs -text
|
| 110 |
+
13.[[:space:]]Batch[[:space:]]Normalization[[:space:]]&[[:space:]]LeNet,[[:space:]]AlexNet[[:space:]]Clothing[[:space:]]Classifier/3.[[:space:]]Build[[:space:]]AlexNet[[:space:]]and[[:space:]]test[[:space:]]on[[:space:]]CIFAR10.mp4 filter=lfs diff=lfs merge=lfs -text
|
| 111 |
+
14.[[:space:]]Advanced[[:space:]]Image[[:space:]]Classiers[[:space:]]-[[:space:]]ImageNet[[:space:]]in[[:space:]]Keras[[:space:]](VGG1619,[[:space:]]InceptionV3,[[:space:]]ResNet50)/1.[[:space:]]Chapter[[:space:]]Introduction.mp4 filter=lfs diff=lfs merge=lfs -text
|
| 112 |
+
14.[[:space:]]Advanced[[:space:]]Image[[:space:]]Classiers[[:space:]]-[[:space:]]ImageNet[[:space:]]in[[:space:]]Keras[[:space:]](VGG1619,[[:space:]]InceptionV3,[[:space:]]ResNet50)/4.[[:space:]]Understanding[[:space:]]ResNet50.mp4 filter=lfs diff=lfs merge=lfs -text
|
| 113 |
+
14.[[:space:]]Advanced[[:space:]]Image[[:space:]]Classiers[[:space:]]-[[:space:]]ImageNet[[:space:]]in[[:space:]]Keras[[:space:]](VGG1619,[[:space:]]InceptionV3,[[:space:]]ResNet50)/3.[[:space:]]Understanding[[:space:]]VGG16[[:space:]]and[[:space:]]VGG19.mp4 filter=lfs diff=lfs merge=lfs -text
|
| 114 |
+
15.[[:space:]]Transfer[[:space:]]Learning[[:space:]]Build[[:space:]]a[[:space:]]Flower[[:space:]]&[[:space:]]Monkey[[:space:]]Breed[[:space:]]Classifier/1.[[:space:]]Chapter[[:space:]]Introduction.mp4 filter=lfs diff=lfs merge=lfs -text
|
| 115 |
+
14.[[:space:]]Advanced[[:space:]]Image[[:space:]]Classiers[[:space:]]-[[:space:]]ImageNet[[:space:]]in[[:space:]]Keras[[:space:]](VGG1619,[[:space:]]InceptionV3,[[:space:]]ResNet50)/5.[[:space:]]Understanding[[:space:]]InceptionV3.mp4 filter=lfs diff=lfs merge=lfs -text
|
| 116 |
+
14.[[:space:]]Advanced[[:space:]]Image[[:space:]]Classiers[[:space:]]-[[:space:]]ImageNet[[:space:]]in[[:space:]]Keras[[:space:]](VGG1619,[[:space:]]InceptionV3,[[:space:]]ResNet50)/2.[[:space:]]ImageNet[[:space:]]-[[:space:]]Experimenting[[:space:]]with[[:space:]]pre-trained[[:space:]]Models[[:space:]]in[[:space:]]Keras[[:space:]](VGG16,[[:space:]]ResNet50,[[:space:]]Mobi.mp4 filter=lfs diff=lfs merge=lfs -text
|
| 117 |
+
16.[[:space:]]Design[[:space:]]Your[[:space:]]Own[[:space:]]CNN[[:space:]]-[[:space:]]LittleVGG/LittleVGG.png filter=lfs diff=lfs merge=lfs -text
|
| 118 |
+
15.[[:space:]]Transfer[[:space:]]Learning[[:space:]]Build[[:space:]]a[[:space:]]Flower[[:space:]]&[[:space:]]Monkey[[:space:]]Breed[[:space:]]Classifier/2.[[:space:]]What[[:space:]]is[[:space:]]Transfer[[:space:]]Learning[[:space:]]and[[:space:]][[:space:]]Fine[[:space:]]Tuning.mp4 filter=lfs diff=lfs merge=lfs -text
|
| 119 |
+
16.[[:space:]]Design[[:space:]]Your[[:space:]]Own[[:space:]]CNN[[:space:]]-[[:space:]]LittleVGG[[:space:]]A[[:space:]]Simpsons[[:space:]]Classifier/1.[[:space:]]Chapter[[:space:]]Introduction.mp4 filter=lfs diff=lfs merge=lfs -text
|
| 120 |
+
16.[[:space:]]Design[[:space:]]Your[[:space:]]Own[[:space:]]CNN[[:space:]]-[[:space:]]LittleVGG[[:space:]]A[[:space:]]Simpsons[[:space:]]Classifier/2.[[:space:]]Introducing[[:space:]]LittleVGG.mp4 filter=lfs diff=lfs merge=lfs -text
|
| 121 |
+
17.[[:space:]]Advanced[[:space:]]Activation[[:space:]]Functions[[:space:]]&[[:space:]]Initializations/1.[[:space:]]Chapter[[:space:]]Introduction.mp4 filter=lfs diff=lfs merge=lfs -text
|
| 122 |
+
15.[[:space:]]Transfer[[:space:]]Learning[[:space:]]Build[[:space:]]a[[:space:]]Flower[[:space:]]&[[:space:]]Monkey[[:space:]]Breed[[:space:]]Classifier/4.[[:space:]]Build[[:space:]]a[[:space:]]Flower[[:space:]]Classifier[[:space:]]with[[:space:]]VGG16[[:space:]]using[[:space:]]Transfer[[:space:]]Learning.mp4 filter=lfs diff=lfs merge=lfs -text
|
| 123 |
+
15.[[:space:]]Transfer[[:space:]]Learning[[:space:]]Build[[:space:]]a[[:space:]]Flower[[:space:]]&[[:space:]]Monkey[[:space:]]Breed[[:space:]]Classifier/3.[[:space:]]Build[[:space:]]a[[:space:]]Monkey[[:space:]]Breed[[:space:]]Classifier[[:space:]]with[[:space:]]MobileNet[[:space:]]using[[:space:]]Transfer[[:space:]]Learning.mp4 filter=lfs diff=lfs merge=lfs -text
|
| 124 |
+
17.[[:space:]]Advanced[[:space:]]Activation[[:space:]]Functions[[:space:]]&[[:space:]]Initializations/3.[[:space:]]Advanced[[:space:]]Initializations.mp4 filter=lfs diff=lfs merge=lfs -text
|
| 125 |
+
17.[[:space:]]Advanced[[:space:]]Activation[[:space:]]Functions[[:space:]]&[[:space:]]Initializations/2.[[:space:]]Dying[[:space:]]ReLU[[:space:]]Problem[[:space:]]and[[:space:]]Introduction[[:space:]]to[[:space:]]Leaky[[:space:]]ReLU,[[:space:]]ELU[[:space:]]and[[:space:]]PReLUs.mp4 filter=lfs diff=lfs merge=lfs -text
|
| 126 |
+
18.[[:space:]]Facial[[:space:]]Applications[[:space:]]-[[:space:]]Emotion,[[:space:]]Age[[:space:]]&[[:space:]]Gender[[:space:]]Recognition/1.[[:space:]]Chapter[[:space:]]Introduction.mp4 filter=lfs diff=lfs merge=lfs -text
|
| 127 |
+
16.[[:space:]]Design[[:space:]]Your[[:space:]]Own[[:space:]]CNN[[:space:]]-[[:space:]]LittleVGG[[:space:]]A[[:space:]]Simpsons[[:space:]]Classifier/3.[[:space:]]Simpsons[[:space:]]Character[[:space:]]Recognition[[:space:]]using[[:space:]]LittleVGG.mp4 filter=lfs diff=lfs merge=lfs -text
|
| 128 |
+
19.[[:space:]]Medical[[:space:]]Imaging[[:space:]]-[[:space:]]Image[[:space:]]Segmentation[[:space:]]with[[:space:]]U-Net/1.[[:space:]]Chapter[[:space:]]Overview[[:space:]]on[[:space:]]Image[[:space:]]Segmentation[[:space:]]&[[:space:]]Medical[[:space:]]Imaging[[:space:]]in[[:space:]]U-Net.mp4 filter=lfs diff=lfs merge=lfs -text
|
| 129 |
+
12.[[:space:]]Optimizers,[[:space:]]Learning[[:space:]]Rates[[:space:]]&[[:space:]]Callbacks[[:space:]]with[[:space:]]Fruit[[:space:]]Classification/4.2[[:space:]]fruits-360.tar.gz filter=lfs diff=lfs merge=lfs -text
|
| 130 |
+
19.[[:space:]]Medical[[:space:]]Imaging[[:space:]]-[[:space:]]Image[[:space:]]Segmentation[[:space:]]with[[:space:]]U-Net/2.[[:space:]]What[[:space:]]is[[:space:]]Segmentation[[:space:]]And[[:space:]]Applications[[:space:]]in[[:space:]]Medical[[:space:]]Imaging.mp4 filter=lfs diff=lfs merge=lfs -text
|
| 131 |
+
19.[[:space:]]Medical[[:space:]]Imaging[[:space:]]-[[:space:]]Image[[:space:]]Segmentation[[:space:]]with[[:space:]]U-Net/3.[[:space:]]U-Net[[:space:]]Image[[:space:]]Segmentation[[:space:]]with[[:space:]]CNNs.mp4 filter=lfs diff=lfs merge=lfs -text
|
| 132 |
+
19.[[:space:]]Medical[[:space:]]Imaging[[:space:]]-[[:space:]]Image[[:space:]]Segmentation[[:space:]]with[[:space:]]U-Net/4.[[:space:]]The[[:space:]]Intersection[[:space:]]over[[:space:]]Union[[:space:]](IoU)[[:space:]]Metric.mp4 filter=lfs diff=lfs merge=lfs -text
|
| 133 |
+
20.[[:space:]]Principles[[:space:]]of[[:space:]]Object[[:space:]]Detection/1.[[:space:]]Chapter[[:space:]]Introduction.mp4 filter=lfs diff=lfs merge=lfs -text
|
| 134 |
+
20.[[:space:]]Principles[[:space:]]of[[:space:]]Object[[:space:]]Detection/4.[[:space:]]Single[[:space:]]Shot[[:space:]]Detectors[[:space:]](SSDs).mp4 filter=lfs diff=lfs merge=lfs -text
|
| 135 |
+
20.[[:space:]]Principles[[:space:]]of[[:space:]]Object[[:space:]]Detection/2.[[:space:]]Object[[:space:]]Detection[[:space:]]Introduction[[:space:]]-[[:space:]]Sliding[[:space:]]Windows[[:space:]]with[[:space:]]HOGs.mp4 filter=lfs diff=lfs merge=lfs -text
|
| 136 |
+
18.[[:space:]]Facial[[:space:]]Applications[[:space:]]-[[:space:]]Emotion,[[:space:]]Age[[:space:]]&[[:space:]]Gender[[:space:]]Recognition/2.[[:space:]]Build[[:space:]]an[[:space:]]Emotion,[[:space:]]Facial[[:space:]]Expression[[:space:]]Detector.mp4 filter=lfs diff=lfs merge=lfs -text
|
| 137 |
+
21.[[:space:]]TensorFlow[[:space:]]Object[[:space:]]Detection[[:space:]]API/1.[[:space:]]Chapter[[:space:]]Introduction.mp4 filter=lfs diff=lfs merge=lfs -text
|
| 138 |
+
20.[[:space:]]Principles[[:space:]]of[[:space:]]Object[[:space:]]Detection/5.[[:space:]]YOLO[[:space:]]to[[:space:]]YOLOv3.mp4 filter=lfs diff=lfs merge=lfs -text
|
| 139 |
+
18.[[:space:]]Facial[[:space:]]Applications[[:space:]]-[[:space:]]Emotion,[[:space:]]Age[[:space:]]&[[:space:]]Gender[[:space:]]Recognition/3.[[:space:]]Build[[:space:]]EmotionAgeGender[[:space:]]Recognition[[:space:]]in[[:space:]]our[[:space:]]Deep[[:space:]]Surveillance[[:space:]]Monitor.mp4 filter=lfs diff=lfs merge=lfs -text
|
| 140 |
+
19.[[:space:]]Medical[[:space:]]Imaging[[:space:]]-[[:space:]]Image[[:space:]]Segmentation[[:space:]]with[[:space:]]U-Net/5.[[:space:]]Finding[[:space:]]the[[:space:]]Nuclei[[:space:]]in[[:space:]]Divergent[[:space:]]Images.mp4 filter=lfs diff=lfs merge=lfs -text
|
| 141 |
+
21.[[:space:]]TensorFlow[[:space:]]Object[[:space:]]Detection[[:space:]]API/2.[[:space:]]TFOD[[:space:]]API[[:space:]]Install[[:space:]]and[[:space:]]Setup.mp4 filter=lfs diff=lfs merge=lfs -text
|
| 142 |
+
22.[[:space:]]Object[[:space:]]Detection[[:space:]]with[[:space:]]YOLO[[:space:]]&[[:space:]]Darkflow[[:space:]]Build[[:space:]]a[[:space:]]London[[:space:]]Underground[[:space:]]Sign[[:space:]]Detector/1.[[:space:]]Chapter[[:space:]]Introduction.mp4 filter=lfs diff=lfs merge=lfs -text
|
| 143 |
+
20.[[:space:]]Principles[[:space:]]of[[:space:]]Object[[:space:]]Detection/3.[[:space:]]R-CNN,[[:space:]]Fast[[:space:]]R-CNN,[[:space:]]Faster[[:space:]]R-CNN[[:space:]]and[[:space:]]Mask[[:space:]]R-CNN.mp4 filter=lfs diff=lfs merge=lfs -text
|
| 144 |
+
21.[[:space:]]TensorFlow[[:space:]]Object[[:space:]]Detection[[:space:]]API/3.[[:space:]]Experiment[[:space:]]with[[:space:]]a[[:space:]]ResNet[[:space:]]SSD[[:space:]]on[[:space:]]images,[[:space:]]webcam[[:space:]]and[[:space:]]videos.mp4 filter=lfs diff=lfs merge=lfs -text
|
| 145 |
+
21.[[:space:]]TensorFlow[[:space:]]Object[[:space:]]Detection[[:space:]]API/4.[[:space:]]How[[:space:]]to[[:space:]]Train[[:space:]]a[[:space:]]TFOD[[:space:]]Model.mp4 filter=lfs diff=lfs merge=lfs -text
|
| 146 |
+
22.[[:space:]]Object[[:space:]]Detection[[:space:]]with[[:space:]]YOLO[[:space:]]&[[:space:]]Darkflow[[:space:]]Build[[:space:]]a[[:space:]]London[[:space:]]Underground[[:space:]]Sign[[:space:]]Detector/2.[[:space:]]Setting[[:space:]]up[[:space:]]and[[:space:]]install[[:space:]]Yolo[[:space:]]DarkNet[[:space:]]and[[:space:]]DarkFlow.mp4 filter=lfs diff=lfs merge=lfs -text
|
| 147 |
+
23.[[:space:]]DeepDream[[:space:]]&[[:space:]]Neural[[:space:]]Style[[:space:]]Transfers[[:space:]]Make[[:space:]]AI[[:space:]]Generated[[:space:]]Art/1.[[:space:]]Chapter[[:space:]]Introduction.mp4 filter=lfs diff=lfs merge=lfs -text
|
| 148 |
+
24.[[:space:]]Generative[[:space:]]Adversarial[[:space:]]Networks[[:space:]](GANs)[[:space:]]Simulate[[:space:]]Aging[[:space:]]Faces/1.[[:space:]]Generative[[:space:]]Adverserial[[:space:]]Neural[[:space:]]Networks[[:space:]]Chapter[[:space:]]Overview.mp4 filter=lfs diff=lfs merge=lfs -text
|
| 149 |
+
22.[[:space:]]Object[[:space:]]Detection[[:space:]]with[[:space:]]YOLO[[:space:]]&[[:space:]]Darkflow[[:space:]]Build[[:space:]]a[[:space:]]London[[:space:]]Underground[[:space:]]Sign[[:space:]]Detector/3.[[:space:]]Experiment[[:space:]]with[[:space:]]YOLO[[:space:]]on[[:space:]]still[[:space:]]images,[[:space:]]webcam[[:space:]]and[[:space:]]videos.mp4 filter=lfs diff=lfs merge=lfs -text
|
| 150 |
+
23.[[:space:]]DeepDream[[:space:]]&[[:space:]]Neural[[:space:]]Style[[:space:]]Transfers[[:space:]]Make[[:space:]]AI[[:space:]]Generated[[:space:]]Art/2.[[:space:]]DeepDream[[:space:]]–[[:space:]]How[[:space:]]AI[[:space:]]Generated[[:space:]]Art[[:space:]]All[[:space:]]Started.mp4 filter=lfs diff=lfs merge=lfs -text
|
| 151 |
+
24.[[:space:]]Generative[[:space:]]Adversarial[[:space:]]Networks[[:space:]](GANs)[[:space:]]Simulate[[:space:]]Aging[[:space:]]Faces/3.[[:space:]]Mathematics[[:space:]]of[[:space:]]GANs.mp4 filter=lfs diff=lfs merge=lfs -text
|
| 152 |
+
22.[[:space:]]Object[[:space:]]Detection[[:space:]]with[[:space:]]YOLO[[:space:]]&[[:space:]]Darkflow[[:space:]]Build[[:space:]]a[[:space:]]London[[:space:]]Underground[[:space:]]Sign[[:space:]]Detector/4.[[:space:]]Build[[:space:]]your[[:space:]]own[[:space:]]YOLO[[:space:]]Object[[:space:]]Detector[[:space:]]-[[:space:]]Detecting[[:space:]]London[[:space:]]Underground[[:space:]]Signs.mp4 filter=lfs diff=lfs merge=lfs -text
|
| 153 |
+
24.[[:space:]]Generative[[:space:]]Adversarial[[:space:]]Networks[[:space:]](GANs)[[:space:]]Simulate[[:space:]]Aging[[:space:]]Faces/2.[[:space:]]Introduction[[:space:]]To[[:space:]]GANs.mp4 filter=lfs diff=lfs merge=lfs -text
|
| 154 |
+
23.[[:space:]]DeepDream[[:space:]]&[[:space:]]Neural[[:space:]]Style[[:space:]]Transfers[[:space:]]Make[[:space:]]AI[[:space:]]Generated[[:space:]]Art/3.[[:space:]]Neural[[:space:]]Style[[:space:]]Transfer.mp4 filter=lfs diff=lfs merge=lfs -text
|
| 155 |
+
24.[[:space:]]Generative[[:space:]]Adversarial[[:space:]]Networks[[:space:]](GANs)[[:space:]]Simulate[[:space:]]Aging[[:space:]]Faces/5.[[:space:]]Face[[:space:]]Aging[[:space:]]GAN.mp4 filter=lfs diff=lfs merge=lfs -text
|
| 156 |
+
25.[[:space:]]Face[[:space:]]Recognition/testfriends.mp4 filter=lfs diff=lfs merge=lfs -text
|
| 157 |
+
24.[[:space:]]Generative[[:space:]]Adversarial[[:space:]]Networks[[:space:]](GANs)[[:space:]]Simulate[[:space:]]Aging[[:space:]]Faces/4.[[:space:]]Implementing[[:space:]]GANs[[:space:]]in[[:space:]]Keras.mp4 filter=lfs diff=lfs merge=lfs -text
|
| 158 |
+
26.[[:space:]]The[[:space:]]Computer[[:space:]]Vision[[:space:]]World/4.[[:space:]]Popular[[:space:]]Computer[[:space:]]Vision[[:space:]]Conferences[[:space:]]&[[:space:]]Finding[[:space:]]Datasets.mp4 filter=lfs diff=lfs merge=lfs -text
|
| 159 |
+
26.[[:space:]]The[[:space:]]Computer[[:space:]]Vision[[:space:]]World/Screenshot[[:space:]]from[[:space:]]2024-08-10[[:space:]]20-58-43.png filter=lfs diff=lfs merge=lfs -text
|
| 160 |
+
26.[[:space:]]The[[:space:]]Computer[[:space:]]Vision[[:space:]]World/Screenshot[[:space:]]from[[:space:]]2024-08-10[[:space:]]20-59-06.png filter=lfs diff=lfs merge=lfs -text
|
| 161 |
+
26.[[:space:]]The[[:space:]]Computer[[:space:]]Vision[[:space:]]World/5.[[:space:]]Building[[:space:]]a[[:space:]]Deep[[:space:]]Learning[[:space:]]Machine[[:space:]]vs.[[:space:]]Cloud[[:space:]]GPUs.mp4 filter=lfs diff=lfs merge=lfs -text
|
| 162 |
+
28.[[:space:]]BONUS[[:space:]]-[[:space:]]Use[[:space:]]Cloud[[:space:]]GPUs[[:space:]]on[[:space:]]PaperSpace/2.1[[:space:]]AlexNet[[:space:]]CIFAR10.zip filter=lfs diff=lfs merge=lfs -text
|
| 163 |
+
4.[[:space:]]Handwriting[[:space:]]Recognition/2.[[:space:]]Experiment[[:space:]]with[[:space:]]a[[:space:]]Handwriting[[:space:]]Classifier.mp4 filter=lfs diff=lfs merge=lfs -text
|
| 164 |
+
4.[[:space:]]Handwriting[[:space:]]Recognition/3.[[:space:]]Experiment[[:space:]]with[[:space:]]a[[:space:]]Image[[:space:]]Classifier.mp4 filter=lfs diff=lfs merge=lfs -text
|
| 165 |
+
5.[[:space:]]OpenCV[[:space:]]Tutorial[[:space:]]-[[:space:]]Learn[[:space:]]Classic[[:space:]]Computer[[:space:]]Vision[[:space:]]&[[:space:]]Face[[:space:]]Detection[[:space:]](OPTIONAL)/10.[[:space:]]Transformations,[[:space:]]Affine[[:space:]]And[[:space:]]Non-Affine[[:space:]]-[[:space:]]The[[:space:]]Many[[:space:]]Ways[[:space:]]We[[:space:]]Can[[:space:]]Change[[:space:]]Images.mp4 filter=lfs diff=lfs merge=lfs -text
|
| 166 |
+
4.[[:space:]]Handwriting[[:space:]]Recognition/4.[[:space:]]OpenCV[[:space:]]Demo[[:space:]]–[[:space:]]Live[[:space:]]Sketch[[:space:]]with[[:space:]]Webcam.mp4 filter=lfs diff=lfs merge=lfs -text
|
| 167 |
+
5.[[:space:]]OpenCV[[:space:]]Tutorial[[:space:]]-[[:space:]]Learn[[:space:]]Classic[[:space:]]Computer[[:space:]]Vision[[:space:]]&[[:space:]]Face[[:space:]]Detection[[:space:]](OPTIONAL)/11.[[:space:]]Image[[:space:]]Translations[[:space:]]-[[:space:]]Moving[[:space:]]Images[[:space:]]Up,[[:space:]]Down.[[:space:]]Left[[:space:]]And[[:space:]]Right.mp4 filter=lfs diff=lfs merge=lfs -text
|
| 168 |
+
5.[[:space:]]OpenCV[[:space:]]Tutorial[[:space:]]-[[:space:]]Learn[[:space:]]Classic[[:space:]]Computer[[:space:]]Vision[[:space:]]&[[:space:]]Face[[:space:]]Detection[[:space:]](OPTIONAL)/12.[[:space:]]Rotations[[:space:]]-[[:space:]]How[[:space:]]To[[:space:]]Spin[[:space:]]Your[[:space:]]Image[[:space:]]Around[[:space:]]And[[:space:]]Do[[:space:]]Horizontal[[:space:]]Flipping.mp4 filter=lfs diff=lfs merge=lfs -text
|
| 169 |
+
5.[[:space:]]OpenCV[[:space:]]Tutorial[[:space:]]-[[:space:]]Learn[[:space:]]Classic[[:space:]]Computer[[:space:]]Vision[[:space:]]&[[:space:]]Face[[:space:]]Detection[[:space:]](OPTIONAL)/14.[[:space:]]Image[[:space:]]Pyramids[[:space:]]-[[:space:]]Another[[:space:]]Way[[:space:]]of[[:space:]]Re-Sizing.mp4 filter=lfs diff=lfs merge=lfs -text
|
| 170 |
+
25.[[:space:]]Face[[:space:]]Recognition[[:space:]]with[[:space:]]VGGFace/.goutputstream-XYXP52 filter=lfs diff=lfs merge=lfs -text
|
| 171 |
+
5.[[:space:]]OpenCV[[:space:]]Tutorial[[:space:]]-[[:space:]]Learn[[:space:]]Classic[[:space:]]Computer[[:space:]]Vision[[:space:]]&[[:space:]]Face[[:space:]]Detection[[:space:]](OPTIONAL)/13.[[:space:]]Scaling,[[:space:]]Re-sizing[[:space:]]and[[:space:]]Interpolations[[:space:]]-[[:space:]]Understand[[:space:]]How[[:space:]]Re-Sizing[[:space:]]Affects[[:space:]]Quality.mp4 filter=lfs diff=lfs merge=lfs -text
|
| 172 |
+
25.[[:space:]]Face[[:space:]]Recognition[[:space:]]with[[:space:]]VGGFace/2.1[[:space:]]vgg_face_weights.h5.tar.gz filter=lfs diff=lfs merge=lfs -text
|
| 173 |
+
5.[[:space:]]OpenCV[[:space:]]Tutorial[[:space:]]-[[:space:]]Learn[[:space:]]Classic[[:space:]]Computer[[:space:]]Vision[[:space:]]&[[:space:]]Face[[:space:]]Detection[[:space:]](OPTIONAL)/15.[[:space:]]Cropping[[:space:]]-[[:space:]]Cut[[:space:]]Out[[:space:]]The[[:space:]]Image[[:space:]]The[[:space:]]Regions[[:space:]]You[[:space:]]Want[[:space:]]or[[:space:]]Don't[[:space:]]Want.mp4 filter=lfs diff=lfs merge=lfs -text
|
| 174 |
+
25.[[:space:]]Face[[:space:]]Recognition[[:space:]]with[[:space:]]VGGFace/2.2[[:space:]]vgg_face_weights.h5.tar.gz filter=lfs diff=lfs merge=lfs -text
|
| 175 |
+
5.[[:space:]]OpenCV[[:space:]]Tutorial[[:space:]]-[[:space:]]Learn[[:space:]]Classic[[:space:]]Computer[[:space:]]Vision[[:space:]]&[[:space:]]Face[[:space:]]Detection[[:space:]](OPTIONAL)/16.[[:space:]]Arithmetic[[:space:]]Operations[[:space:]]-[[:space:]]Brightening[[:space:]]and[[:space:]]Darkening[[:space:]]Images.mp4 filter=lfs diff=lfs merge=lfs -text
|
| 176 |
+
5.[[:space:]]OpenCV[[:space:]]Tutorial[[:space:]]-[[:space:]]Learn[[:space:]]Classic[[:space:]]Computer[[:space:]]Vision[[:space:]]&[[:space:]]Face[[:space:]]Detection[[:space:]](OPTIONAL)/17.[[:space:]]Bitwise[[:space:]]Operations[[:space:]]-[[:space:]]How[[:space:]]Image[[:space:]]Masking[[:space:]]Works.mp4 filter=lfs diff=lfs merge=lfs -text
|
| 177 |
+
5.[[:space:]]OpenCV[[:space:]]Tutorial[[:space:]]-[[:space:]]Learn[[:space:]]Classic[[:space:]]Computer[[:space:]]Vision[[:space:]]&[[:space:]]Face[[:space:]]Detection[[:space:]](OPTIONAL)/19.[[:space:]]Sharpening[[:space:]]-[[:space:]]Reverse[[:space:]]Your[[:space:]]Images[[:space:]]Blurs.mp4 filter=lfs diff=lfs merge=lfs -text
|
| 178 |
+
5.[[:space:]]OpenCV[[:space:]]Tutorial[[:space:]]-[[:space:]]Learn[[:space:]]Classic[[:space:]]Computer[[:space:]]Vision[[:space:]]&[[:space:]]Face[[:space:]]Detection[[:space:]](OPTIONAL)/2.[[:space:]]What[[:space:]]are[[:space:]]Images.mp4 filter=lfs diff=lfs merge=lfs -text
|
| 179 |
+
5.[[:space:]]OpenCV[[:space:]]Tutorial[[:space:]]-[[:space:]]Learn[[:space:]]Classic[[:space:]]Computer[[:space:]]Vision[[:space:]]&[[:space:]]Face[[:space:]]Detection[[:space:]](OPTIONAL)/18.[[:space:]]Blurring[[:space:]]-[[:space:]]The[[:space:]]Many[[:space:]]Ways[[:space:]]We[[:space:]]Can[[:space:]]Blur[[:space:]]Images[[:space:]]&[[:space:]]Why[[:space:]]It's[[:space:]]Important.mp4 filter=lfs diff=lfs merge=lfs -text
|
| 180 |
+
5.[[:space:]]OpenCV[[:space:]]Tutorial[[:space:]]-[[:space:]]Learn[[:space:]]Classic[[:space:]]Computer[[:space:]]Vision[[:space:]]&[[:space:]]Face[[:space:]]Detection[[:space:]](OPTIONAL)/21.[[:space:]]Dilation,[[:space:]]Erosion,[[:space:]]OpeningClosing[[:space:]]-[[:space:]]Importance[[:space:]]of[[:space:]]ThickeningThinning[[:space:]]Lines.mp4 filter=lfs diff=lfs merge=lfs -text
|
| 181 |
+
5.[[:space:]]OpenCV[[:space:]]Tutorial[[:space:]]-[[:space:]]Learn[[:space:]]Classic[[:space:]]Computer[[:space:]]Vision[[:space:]]&[[:space:]]Face[[:space:]]Detection[[:space:]](OPTIONAL)/20.[[:space:]]Thresholding[[:space:]](Binarization)[[:space:]]-[[:space:]]Making[[:space:]]Certain[[:space:]]Images[[:space:]]Areas[[:space:]]Black[[:space:]]or[[:space:]]White.mp4 filter=lfs diff=lfs merge=lfs -text
|
| 182 |
+
5.[[:space:]]OpenCV[[:space:]]Tutorial[[:space:]]-[[:space:]]Learn[[:space:]]Classic[[:space:]]Computer[[:space:]]Vision[[:space:]]&[[:space:]]Face[[:space:]]Detection[[:space:]](OPTIONAL)/22.[[:space:]]Edge[[:space:]]Detection[[:space:]]using[[:space:]]Image[[:space:]]Gradients[[:space:]]&[[:space:]]Canny[[:space:]]Edge[[:space:]]Detection.mp4 filter=lfs diff=lfs merge=lfs -text
|
| 183 |
+
5.[[:space:]]OpenCV[[:space:]]Tutorial[[:space:]]-[[:space:]]Learn[[:space:]]Classic[[:space:]]Computer[[:space:]]Vision[[:space:]]&[[:space:]]Face[[:space:]]Detection[[:space:]](OPTIONAL)/23.[[:space:]]Perspective[[:space:]]&[[:space:]]Affine[[:space:]]Transforms[[:space:]]-[[:space:]]Take[[:space:]]An[[:space:]]Off[[:space:]]Angle[[:space:]]Shot[[:space:]]&[[:space:]]Make[[:space:]]It[[:space:]]Look[[:space:]]Top[[:space:]]Down.mp4 filter=lfs diff=lfs merge=lfs -text
|
| 184 |
+
5.[[:space:]]OpenCV[[:space:]]Tutorial[[:space:]]-[[:space:]]Learn[[:space:]]Classic[[:space:]]Computer[[:space:]]Vision[[:space:]]&[[:space:]]Face[[:space:]]Detection[[:space:]](OPTIONAL)/24.[[:space:]]Mini[[:space:]]Project[[:space:]]1[[:space:]]-[[:space:]]Live[[:space:]]Sketch[[:space:]]App[[:space:]]-[[:space:]]Turn[[:space:]]your[[:space:]]Webcam[[:space:]]Feed[[:space:]]Into[[:space:]]A[[:space:]]Pencil[[:space:]]Drawing.mp4 filter=lfs diff=lfs merge=lfs -text
|
| 185 |
+
5.[[:space:]]OpenCV[[:space:]]Tutorial[[:space:]]-[[:space:]]Learn[[:space:]]Classic[[:space:]]Computer[[:space:]]Vision[[:space:]]&[[:space:]]Face[[:space:]]Detection[[:space:]](OPTIONAL)/27.[[:space:]]Approximating[[:space:]]Contours[[:space:]]&[[:space:]]Finding[[:space:]]Their[[:space:]]Convex[[:space:]]Hull[[:space:]]-[[:space:]]Clean[[:space:]]Up[[:space:]]Messy[[:space:]]Contours.mp4 filter=lfs diff=lfs merge=lfs -text
|
| 186 |
+
5.[[:space:]]OpenCV[[:space:]]Tutorial[[:space:]]-[[:space:]]Learn[[:space:]]Classic[[:space:]]Computer[[:space:]]Vision[[:space:]]&[[:space:]]Face[[:space:]]Detection[[:space:]](OPTIONAL)/25.[[:space:]]Segmentation[[:space:]]and[[:space:]]Contours[[:space:]]-[[:space:]]Extract[[:space:]]Defined[[:space:]]Shapes[[:space:]]In[[:space:]]Your[[:space:]]Image.mp4 filter=lfs diff=lfs merge=lfs -text
|
| 187 |
+
5.[[:space:]]OpenCV[[:space:]]Tutorial[[:space:]]-[[:space:]]Learn[[:space:]]Classic[[:space:]]Computer[[:space:]]Vision[[:space:]]&[[:space:]]Face[[:space:]]Detection[[:space:]](OPTIONAL)/3.[[:space:]]How[[:space:]]are[[:space:]]Images[[:space:]]Formed.mp4 filter=lfs diff=lfs merge=lfs -text
|
| 188 |
+
5.[[:space:]]OpenCV[[:space:]]Tutorial[[:space:]]-[[:space:]]Learn[[:space:]]Classic[[:space:]]Computer[[:space:]]Vision[[:space:]]&[[:space:]]Face[[:space:]]Detection[[:space:]](OPTIONAL)/28.[[:space:]]Matching[[:space:]]Contour[[:space:]]Shapes[[:space:]]-[[:space:]]Match[[:space:]]Shapes[[:space:]]In[[:space:]]Images[[:space:]]Even[[:space:]]When[[:space:]]Distorted.mp4 filter=lfs diff=lfs merge=lfs -text
|
| 189 |
+
5.[[:space:]]OpenCV[[:space:]]Tutorial[[:space:]]-[[:space:]]Learn[[:space:]]Classic[[:space:]]Computer[[:space:]]Vision[[:space:]]&[[:space:]]Face[[:space:]]Detection[[:space:]](OPTIONAL)/29.[[:space:]]Mini[[:space:]]Project[[:space:]]2[[:space:]]-[[:space:]]Identify[[:space:]]Shapes[[:space:]](Square,[[:space:]]Rectangle,[[:space:]]Circle,[[:space:]]Triangle[[:space:]]&[[:space:]]Stars).mp4 filter=lfs diff=lfs merge=lfs -text
|
| 190 |
+
5.[[:space:]]OpenCV[[:space:]]Tutorial[[:space:]]-[[:space:]]Learn[[:space:]]Classic[[:space:]]Computer[[:space:]]Vision[[:space:]]&[[:space:]]Face[[:space:]]Detection[[:space:]](OPTIONAL)/26.[[:space:]]Sorting[[:space:]]Contours[[:space:]]-[[:space:]]Sort[[:space:]]Those[[:space:]]Shapes[[:space:]]By[[:space:]]Size.mp4 filter=lfs diff=lfs merge=lfs -text
|
| 191 |
+
5.[[:space:]]OpenCV[[:space:]]Tutorial[[:space:]]-[[:space:]]Learn[[:space:]]Classic[[:space:]]Computer[[:space:]]Vision[[:space:]]&[[:space:]]Face[[:space:]]Detection[[:space:]](OPTIONAL)/32.[[:space:]]Blob[[:space:]]Detection[[:space:]]-[[:space:]]Detect[[:space:]]The[[:space:]]Center[[:space:]]of[[:space:]]Flowers.mp4 filter=lfs diff=lfs merge=lfs -text
|
| 192 |
+
5.[[:space:]]OpenCV[[:space:]]Tutorial[[:space:]]-[[:space:]]Learn[[:space:]]Classic[[:space:]]Computer[[:space:]]Vision[[:space:]]&[[:space:]]Face[[:space:]]Detection[[:space:]](OPTIONAL)/30.[[:space:]]Line[[:space:]]Detection[[:space:]]-[[:space:]]Detect[[:space:]]Straight[[:space:]]Lines[[:space:]]E.g.[[:space:]]The[[:space:]]Lines[[:space:]]On[[:space:]]A[[:space:]]Sudoku[[:space:]]Game.mp4 filter=lfs diff=lfs merge=lfs -text
|
| 193 |
+
5.[[:space:]]OpenCV[[:space:]]Tutorial[[:space:]]-[[:space:]]Learn[[:space:]]Classic[[:space:]]Computer[[:space:]]Vision[[:space:]]&[[:space:]]Face[[:space:]]Detection[[:space:]](OPTIONAL)/34.[[:space:]]Object[[:space:]]Detection[[:space:]]Overview.mp4 filter=lfs diff=lfs merge=lfs -text
|
| 194 |
+
5.[[:space:]]OpenCV[[:space:]]Tutorial[[:space:]]-[[:space:]]Learn[[:space:]]Classic[[:space:]]Computer[[:space:]]Vision[[:space:]]&[[:space:]]Face[[:space:]]Detection[[:space:]](OPTIONAL)/35.[[:space:]]Mini[[:space:]]Project[[:space:]]\#[[:space:]]4[[:space:]]-[[:space:]]Finding[[:space:]]Waldo[[:space:]](Quickly[[:space:]]Find[[:space:]]A[[:space:]]Specific[[:space:]]Pattern[[:space:]]In[[:space:]]An[[:space:]]Image).mp4 filter=lfs diff=lfs merge=lfs -text
|
| 195 |
+
5.[[:space:]]OpenCV[[:space:]]Tutorial[[:space:]]-[[:space:]]Learn[[:space:]]Classic[[:space:]]Computer[[:space:]]Vision[[:space:]]&[[:space:]]Face[[:space:]]Detection[[:space:]](OPTIONAL)/33.[[:space:]]Mini[[:space:]]Project[[:space:]]3[[:space:]]-[[:space:]]Counting[[:space:]]Circles[[:space:]]and[[:space:]]Ellipses.mp4 filter=lfs diff=lfs merge=lfs -text
|
| 196 |
+
5.[[:space:]]OpenCV[[:space:]]Tutorial[[:space:]]-[[:space:]]Learn[[:space:]]Classic[[:space:]]Computer[[:space:]]Vision[[:space:]]&[[:space:]]Face[[:space:]]Detection[[:space:]](OPTIONAL)/36.[[:space:]]Feature[[:space:]]Description[[:space:]]Theory[[:space:]]-[[:space:]]How[[:space:]]We[[:space:]]Digitally[[:space:]]Represent[[:space:]]Objects.mp4 filter=lfs diff=lfs merge=lfs -text
|
| 197 |
+
5.[[:space:]]OpenCV[[:space:]]Tutorial[[:space:]]-[[:space:]]Learn[[:space:]]Classic[[:space:]]Computer[[:space:]]Vision[[:space:]]&[[:space:]]Face[[:space:]]Detection[[:space:]](OPTIONAL)/37.[[:space:]]Finding[[:space:]]Corners[[:space:]]-[[:space:]]Why[[:space:]]Corners[[:space:]]In[[:space:]]Images[[:space:]]Are[[:space:]]Important[[:space:]]to[[:space:]]Object[[:space:]]Detection.mp4 filter=lfs diff=lfs merge=lfs -text
|
| 198 |
+
5.[[:space:]]OpenCV[[:space:]]Tutorial[[:space:]]-[[:space:]]Learn[[:space:]]Classic[[:space:]]Computer[[:space:]]Vision[[:space:]]&[[:space:]]Face[[:space:]]Detection[[:space:]](OPTIONAL)/39.[[:space:]]HAAR[[:space:]]Cascade[[:space:]]Classifiers[[:space:]]-[[:space:]]Learn[[:space:]]How[[:space:]]Classifiers[[:space:]]Work[[:space:]]And[[:space:]]Why[[:space:]]They're[[:space:]]Amazing.mp4 filter=lfs diff=lfs merge=lfs -text
|
| 199 |
+
5.[[:space:]]OpenCV[[:space:]]Tutorial[[:space:]]-[[:space:]]Learn[[:space:]]Classic[[:space:]]Computer[[:space:]]Vision[[:space:]]&[[:space:]]Face[[:space:]]Detection[[:space:]](OPTIONAL)/4.[[:space:]]Storing[[:space:]]Images[[:space:]]on[[:space:]]Computers.mp4 filter=lfs diff=lfs merge=lfs -text
|
| 200 |
+
5.[[:space:]]OpenCV[[:space:]]Tutorial[[:space:]]-[[:space:]]Learn[[:space:]]Classic[[:space:]]Computer[[:space:]]Vision[[:space:]]&[[:space:]]Face[[:space:]]Detection[[:space:]](OPTIONAL)/38.[[:space:]]Histogram[[:space:]]of[[:space:]]Oriented[[:space:]]Gradients[[:space:]]-[[:space:]]Another[[:space:]]Novel[[:space:]]Way[[:space:]]Of[[:space:]]Representing[[:space:]]Images.mp4 filter=lfs diff=lfs merge=lfs -text
|
| 201 |
+
5.[[:space:]]OpenCV[[:space:]]Tutorial[[:space:]]-[[:space:]]Learn[[:space:]]Classic[[:space:]]Computer[[:space:]]Vision[[:space:]]&[[:space:]]Face[[:space:]]Detection[[:space:]](OPTIONAL)/41.1[[:space:]]Lecture[[:space:]]6.2[[:space:]]and[[:space:]]6.3.tar.gz filter=lfs diff=lfs merge=lfs -text
5.[[:space:]]OpenCV[[:space:]]Tutorial[[:space:]]-[[:space:]]Learn[[:space:]]Classic[[:space:]]Computer[[:space:]]Vision[[:space:]]&[[:space:]]Face[[:space:]]Detection[[:space:]](OPTIONAL)/41.[[:space:]]Mini[[:space:]]Project[[:space:]]6[[:space:]]-[[:space:]]Car[[:space:]]and[[:space:]]Pedestrian[[:space:]]Detection[[:space:]]in[[:space:]]Videos.mp4 filter=lfs diff=lfs merge=lfs -text
5.[[:space:]]OpenCV[[:space:]]Tutorial[[:space:]]-[[:space:]]Learn[[:space:]]Classic[[:space:]]Computer[[:space:]]Vision[[:space:]]&[[:space:]]Face[[:space:]]Detection[[:space:]](OPTIONAL)/40.[[:space:]]Face[[:space:]]and[[:space:]]Eye[[:space:]]Detection[[:space:]]-[[:space:]]Detect[[:space:]]Human[[:space:]]Faces[[:space:]]and[[:space:]]Eyes[[:space:]]In[[:space:]]Any[[:space:]]Image.mp4 filter=lfs diff=lfs merge=lfs -text
5.[[:space:]]OpenCV[[:space:]]Tutorial[[:space:]]-[[:space:]]Learn[[:space:]]Classic[[:space:]]Computer[[:space:]]Vision[[:space:]]&[[:space:]]Face[[:space:]]Detection[[:space:]](OPTIONAL)/6.[[:space:]]Grayscaling[[:space:]]-[[:space:]]Converting[[:space:]]Color[[:space:]]Images[[:space:]]To[[:space:]]Shades[[:space:]]of[[:space:]]Gray.mp4 filter=lfs diff=lfs merge=lfs -text
5.[[:space:]]OpenCV[[:space:]]Tutorial[[:space:]]-[[:space:]]Learn[[:space:]]Classic[[:space:]]Computer[[:space:]]Vision[[:space:]]&[[:space:]]Face[[:space:]]Detection[[:space:]](OPTIONAL)/Screenshot_2025-03-13_21-47-57.png filter=lfs diff=lfs merge=lfs -text
5.[[:space:]]OpenCV[[:space:]]Tutorial[[:space:]]-[[:space:]]Learn[[:space:]]Classic[[:space:]]Computer[[:space:]]Vision[[:space:]]&[[:space:]]Face[[:space:]]Detection[[:space:]](OPTIONAL)/Screenshot_2025-03-13_21-48-23.png filter=lfs diff=lfs merge=lfs -text
6.[[:space:]]Neural[[:space:]]Networks[[:space:]]Explained/1.[[:space:]]Neural[[:space:]]Networks[[:space:]]Chapter[[:space:]]Overview.mp4 filter=lfs diff=lfs merge=lfs -text
5.[[:space:]]OpenCV[[:space:]]Tutorial[[:space:]]-[[:space:]]Learn[[:space:]]Classic[[:space:]]Computer[[:space:]]Vision[[:space:]]&[[:space:]]Face[[:space:]]Detection[[:space:]](OPTIONAL)/5.[[:space:]]Getting[[:space:]]Started[[:space:]]with[[:space:]]OpenCV[[:space:]]-[[:space:]]A[[:space:]]Brief[[:space:]]OpenCV[[:space:]]Intro.mp4 filter=lfs diff=lfs merge=lfs -text
5.[[:space:]]OpenCV[[:space:]]Tutorial[[:space:]]-[[:space:]]Learn[[:space:]]Classic[[:space:]]Computer[[:space:]]Vision[[:space:]]&[[:space:]]Face[[:space:]]Detection[[:space:]](OPTIONAL)/9.[[:space:]]Creating[[:space:]]Images[[:space:]]&[[:space:]]Drawing[[:space:]]on[[:space:]]Images[[:space:]]-[[:space:]]Make[[:space:]]Squares,[[:space:]]Circles,[[:space:]]Polygons[[:space:]]&[[:space:]]Add[[:space:]]Text.mp4 filter=lfs diff=lfs merge=lfs -text
5.[[:space:]]OpenCV[[:space:]]Tutorial[[:space:]]-[[:space:]]Learn[[:space:]]Classic[[:space:]]Computer[[:space:]]Vision[[:space:]]&[[:space:]]Face[[:space:]]Detection[[:space:]](OPTIONAL)/8.[[:space:]]Histogram[[:space:]]representation[[:space:]]of[[:space:]]Images[[:space:]]-[[:space:]]Visualizing[[:space:]]the[[:space:]]Components[[:space:]]of[[:space:]]Images.mp4 filter=lfs diff=lfs merge=lfs -text
6.[[:space:]]Neural[[:space:]]Networks[[:space:]]Explained/10.[[:space:]]Epochs,[[:space:]]Iterations[[:space:]]and[[:space:]]Batch[[:space:]]Sizes.mp4 filter=lfs diff=lfs merge=lfs -text
6.[[:space:]]Neural[[:space:]]Networks[[:space:]]Explained/12.[[:space:]]Review[[:space:]]and[[:space:]]Best[[:space:]]Practices.mp4 filter=lfs diff=lfs merge=lfs -text
5.[[:space:]]OpenCV[[:space:]]Tutorial[[:space:]]-[[:space:]]Learn[[:space:]]Classic[[:space:]]Computer[[:space:]]Vision[[:space:]]&[[:space:]]Face[[:space:]]Detection[[:space:]](OPTIONAL)/7.[[:space:]]Understanding[[:space:]]Color[[:space:]]Spaces[[:space:]]-[[:space:]]The[[:space:]]Many[[:space:]]Ways[[:space:]]Color[[:space:]]Images[[:space:]]Are[[:space:]]Stored[[:space:]]Digitally.mp4 filter=lfs diff=lfs merge=lfs -text
6.[[:space:]]Neural[[:space:]]Networks[[:space:]]Explained/11.[[:space:]]Measuring[[:space:]]Performance[[:space:]]and[[:space:]]the[[:space:]]Confusion[[:space:]]Matrix.mp4 filter=lfs diff=lfs merge=lfs -text
6.[[:space:]]Neural[[:space:]]Networks[[:space:]]Explained/3.[[:space:]]Neural[[:space:]]Networks[[:space:]]Explained.mp4 filter=lfs diff=lfs merge=lfs -text
6.[[:space:]]Neural[[:space:]]Networks[[:space:]]Explained/2.[[:space:]]Machine[[:space:]]Learning[[:space:]]Overview.mp4 filter=lfs diff=lfs merge=lfs -text
6.[[:space:]]Neural[[:space:]]Networks[[:space:]]Explained/4.[[:space:]]Forward[[:space:]]Propagation.mp4 filter=lfs diff=lfs merge=lfs -text
7.[[:space:]]Convolutional[[:space:]]Neural[[:space:]]Networks[[:space:]](CNNs)[[:space:]]Explained/1.[[:space:]]Convolutional[[:space:]]Neural[[:space:]]Networks[[:space:]]Chapter[[:space:]]Overview.mp4 filter=lfs diff=lfs merge=lfs -text
6.[[:space:]]Neural[[:space:]]Networks[[:space:]]Explained/5.[[:space:]]Activation[[:space:]]Functions.mp4 filter=lfs diff=lfs merge=lfs -text
6.[[:space:]]Neural[[:space:]]Networks[[:space:]]Explained/6.[[:space:]]Training[[:space:]]Part[[:space:]]1[[:space:]]–[[:space:]]Loss[[:space:]]Functions.mp4 filter=lfs diff=lfs merge=lfs -text
6.[[:space:]]Neural[[:space:]]Networks[[:space:]]Explained/7.[[:space:]]Training[[:space:]]Part[[:space:]]2[[:space:]]–[[:space:]]Backpropagation[[:space:]]and[[:space:]]Gradient[[:space:]]Descent.mp4 filter=lfs diff=lfs merge=lfs -text
7.[[:space:]]Convolutional[[:space:]]Neural[[:space:]]Networks[[:space:]](CNNs)[[:space:]]Explained/2.[[:space:]]Convolutional[[:space:]]Neural[[:space:]]Networks[[:space:]]Introduction.mp4 filter=lfs diff=lfs merge=lfs -text
7.[[:space:]]Convolutional[[:space:]]Neural[[:space:]]Networks[[:space:]](CNNs)[[:space:]]Explained/5.[[:space:]]ReLU.mp4 filter=lfs diff=lfs merge=lfs -text
7.[[:space:]]Convolutional[[:space:]]Neural[[:space:]]Networks[[:space:]](CNNs)[[:space:]]Explained/7.[[:space:]]The[[:space:]]Fully[[:space:]]Connected[[:space:]]Layer.mp4 filter=lfs diff=lfs merge=lfs -text
6.[[:space:]]Neural[[:space:]]Networks[[:space:]]Explained/8.[[:space:]]Backpropagation[[:space:]]&[[:space:]]Learning[[:space:]]Rates[[:space:]]–[[:space:]]A[[:space:]]Worked[[:space:]]Example.mp4 filter=lfs diff=lfs merge=lfs -text
7.[[:space:]]Convolutional[[:space:]]Neural[[:space:]]Networks[[:space:]](CNNs)[[:space:]]Explained/4.[[:space:]]Depth,[[:space:]]Stride[[:space:]]and[[:space:]]Padding.mp4 filter=lfs diff=lfs merge=lfs -text
6.[[:space:]]Neural[[:space:]]Networks[[:space:]]Explained/9.[[:space:]]Regularization,[[:space:]]Overfitting,[[:space:]]Generalization[[:space:]]and[[:space:]]Test[[:space:]]Datasets.mp4 filter=lfs diff=lfs merge=lfs -text
7.[[:space:]]Convolutional[[:space:]]Neural[[:space:]]Networks[[:space:]](CNNs)[[:space:]]Explained/6.[[:space:]]Pooling.mp4 filter=lfs diff=lfs merge=lfs -text
8.[[:space:]]Build[[:space:]]CNNs[[:space:]]in[[:space:]]Python[[:space:]]using[[:space:]]Keras/1.[[:space:]]Building[[:space:]]a[[:space:]]CNN[[:space:]]in[[:space:]]Keras.mp4 filter=lfs diff=lfs merge=lfs -text
7.[[:space:]]Convolutional[[:space:]]Neural[[:space:]]Networks[[:space:]](CNNs)[[:space:]]Explained/8.[[:space:]]Training[[:space:]]CNNs.mp4 filter=lfs diff=lfs merge=lfs -text
7.[[:space:]]Convolutional[[:space:]]Neural[[:space:]]Networks[[:space:]](CNNs)[[:space:]]Explained/9.[[:space:]]Designing[[:space:]]Your[[:space:]]Own[[:space:]]CNN.mp4 filter=lfs diff=lfs merge=lfs -text
7.[[:space:]]Convolutional[[:space:]]Neural[[:space:]]Networks[[:space:]](CNNs)[[:space:]]Explained/3.[[:space:]]Convolutions[[:space:]]&[[:space:]]Image[[:space:]]Features.mp4 filter=lfs diff=lfs merge=lfs -text
8.[[:space:]]Build[[:space:]]CNNs[[:space:]]in[[:space:]]Python[[:space:]]using[[:space:]]Keras/10.[[:space:]]Saving[[:space:]]and[[:space:]]Loading[[:space:]]Your[[:space:]]Model.mp4 filter=lfs diff=lfs merge=lfs -text
8.[[:space:]]Build[[:space:]]CNNs[[:space:]]in[[:space:]]Python[[:space:]]using[[:space:]]Keras/11.[[:space:]]Displaying[[:space:]]Your[[:space:]]Model[[:space:]]Visually.mp4 filter=lfs diff=lfs merge=lfs -text
8.[[:space:]]Build[[:space:]]CNNs[[:space:]]in[[:space:]]Python[[:space:]]using[[:space:]]Keras/3.[[:space:]]Building[[:space:]]a[[:space:]]Handwriting[[:space:]]Recognition[[:space:]]CNN.mp4 filter=lfs diff=lfs merge=lfs -text
Module[[:space:]]6[[:space:]]Machine[[:space:]]Learning[[:space:]]Real-World[[:space:]]Case[[:space:]]Studies/Case[[:space:]]study[[:space:]]4Taxi[[:space:]]demand[[:space:]]prediction[[:space:]]in[[:space:]]New[[:space:]]York[[:space:]]City/Notes[[:space:]]and[[:space:]]Resources/yellow_tripdata_2016-01.csv filter=lfs diff=lfs merge=lfs -text
8.[[:space:]]Build[[:space:]]CNNs[[:space:]]in[[:space:]]Python[[:space:]]using[[:space:]]Keras/5.[[:space:]]Getting[[:space:]]our[[:space:]]data[[:space:]]in[[:space:]]‘Shape’.mp4 filter=lfs diff=lfs merge=lfs -text
8.[[:space:]]Build[[:space:]]CNNs[[:space:]]in[[:space:]]Python[[:space:]]using[[:space:]]Keras/6.[[:space:]]Hot[[:space:]]One[[:space:]]Encoding.mp4 filter=lfs diff=lfs merge=lfs -text
8.[[:space:]]Build[[:space:]]CNNs[[:space:]]in[[:space:]]Python[[:space:]]using[[:space:]]Keras/4.[[:space:]]Loading[[:space:]]Our[[:space:]]Data.mp4 filter=lfs diff=lfs merge=lfs -text
8.[[:space:]]Build[[:space:]]CNNs[[:space:]]in[[:space:]]Python[[:space:]]using[[:space:]]Keras/12.[[:space:]]Building[[:space:]]a[[:space:]]Simple[[:space:]]Image[[:space:]]Classifier[[:space:]]using[[:space:]]CIFAR10.mp4 filter=lfs diff=lfs merge=lfs -text
8.[[:space:]]Build[[:space:]]CNNs[[:space:]]in[[:space:]]Python[[:space:]]using[[:space:]]Keras/2.[[:space:]]Introduction[[:space:]]to[[:space:]]Keras[[:space:]]&[[:space:]]Tensorflow.mp4 filter=lfs diff=lfs merge=lfs -text
8.[[:space:]]Build[[:space:]]CNNs[[:space:]]in[[:space:]]Python[[:space:]]using[[:space:]]Keras/7.[[:space:]]Building[[:space:]]&[[:space:]]Compiling[[:space:]]Our[[:space:]]Model.mp4 filter=lfs diff=lfs merge=lfs -text
8.[[:space:]]Build[[:space:]]CNNs[[:space:]]in[[:space:]]Python[[:space:]]using[[:space:]]Keras/9.[[:space:]]Plotting[[:space:]]Loss[[:space:]]and[[:space:]]Accuracy[[:space:]]Charts.mp4 filter=lfs diff=lfs merge=lfs -text
8.[[:space:]]Build[[:space:]]CNNs[[:space:]]in[[:space:]]Python[[:space:]]using[[:space:]]Keras/8.[[:space:]]Training[[:space:]]Our[[:space:]]Classifier.mp4 filter=lfs diff=lfs merge=lfs -text
9.[[:space:]]What[[:space:]]CNNs[[:space:]]'see'[[:space:]]-[[:space:]]Filter[[:space:]]Visualizations,[[:space:]]Heatmaps[[:space:]]and[[:space:]]Salience[[:space:]]Maps/1.[[:space:]]Introduction[[:space:]]to[[:space:]]Visualizing[[:space:]]What[[:space:]]CNNs[[:space:]]'see'[[:space:]]&[[:space:]]Filter[[:space:]]Visualizations.mp4 filter=lfs diff=lfs merge=lfs -text
Module[[:space:]]6[[:space:]]Machine[[:space:]]Learning[[:space:]]Real-World[[:space:]]Case[[:space:]]Studies/Case[[:space:]]Study[[:space:]]1[[:space:]]Quora[[:space:]]question[[:space:]]Pair[[:space:]]Similarity[[:space:]]Problem/1.[[:space:]]How[[:space:]]to[[:space:]]optimally[[:space:]]learn[[:space:]]from[[:space:]]case-studies[[:space:]]in[[:space:]]the[[:space:]]course.mp3 filter=lfs diff=lfs merge=lfs -text
9.[[:space:]]What[[:space:]]CNNs[[:space:]]'see'[[:space:]]-[[:space:]]Filter[[:space:]]Visualizations,[[:space:]]Heatmaps[[:space:]]and[[:space:]]Salience[[:space:]]Maps/2.[[:space:]]Saliency[[:space:]]Maps[[:space:]]&[[:space:]]Class[[:space:]]Activation[[:space:]]Maps.mp4 filter=lfs diff=lfs merge=lfs -text
9.[[:space:]]What[[:space:]]CNNs[[:space:]]'see'[[:space:]]-[[:space:]]Filter[[:space:]]Visualizations,[[:space:]]Heatmaps[[:space:]]and[[:space:]]Salience[[:space:]]Maps/3.[[:space:]]Saliency[[:space:]]Maps[[:space:]]&[[:space:]]Class[[:space:]]Activation[[:space:]]Maps.mp4 filter=lfs diff=lfs merge=lfs -text
9.[[:space:]]What[[:space:]]CNNs[[:space:]]'see'[[:space:]]-[[:space:]]Filter[[:space:]]Visualizations,[[:space:]]Heatmaps[[:space:]]and[[:space:]]Salience[[:space:]]Maps/4.[[:space:]]Filter[[:space:]]Visualizations.mp4 filter=lfs diff=lfs merge=lfs -text
Module[[:space:]]6[[:space:]]Machine[[:space:]]Learning[[:space:]]Real-World[[:space:]]Case[[:space:]]Studies/Case[[:space:]]study[[:space:]]4Taxi[[:space:]]demand[[:space:]]prediction[[:space:]]in[[:space:]]New[[:space:]]York[[:space:]]City/Notes[[:space:]]and[[:space:]]Resources/cluster_centers.JPG filter=lfs diff=lfs merge=lfs -text
Module[[:space:]]6[[:space:]]Machine[[:space:]]Learning[[:space:]]Real-World[[:space:]]Case[[:space:]]Studies/Case[[:space:]]study[[:space:]]4Taxi[[:space:]]demand[[:space:]]prediction[[:space:]]in[[:space:]]New[[:space:]]York[[:space:]]City/Notes[[:space:]]and[[:space:]]Resources/dropoff_outliers.JPG filter=lfs diff=lfs merge=lfs -text
Module[[:space:]]6[[:space:]]Machine[[:space:]]Learning[[:space:]]Real-World[[:space:]]Case[[:space:]]Studies/Case[[:space:]]study[[:space:]]4Taxi[[:space:]]demand[[:space:]]prediction[[:space:]]in[[:space:]]New[[:space:]]York[[:space:]]City/Notes[[:space:]]and[[:space:]]Resources/install_graphviz.JPG filter=lfs diff=lfs merge=lfs -text
Module[[:space:]]6[[:space:]]Machine[[:space:]]Learning[[:space:]]Real-World[[:space:]]Case[[:space:]]Studies/Case[[:space:]]study[[:space:]]4Taxi[[:space:]]demand[[:space:]]prediction[[:space:]]in[[:space:]]New[[:space:]]York[[:space:]]City/Notes[[:space:]]and[[:space:]]Resources/mydask.png filter=lfs diff=lfs merge=lfs -text
Module[[:space:]]6[[:space:]]Machine[[:space:]]Learning[[:space:]]Real-World[[:space:]]Case[[:space:]]Studies/Case[[:space:]]study[[:space:]]4Taxi[[:space:]]demand[[:space:]]prediction[[:space:]]in[[:space:]]New[[:space:]]York[[:space:]]City/Notes[[:space:]]and[[:space:]]Resources/New_york.gif filter=lfs diff=lfs merge=lfs -text
Module[[:space:]]5[[:space:]]Feature[[:space:]]Engineering[[:space:]]Productionization[[:space:]]and[[:space:]]deployment[[:space:]]of[[:space:]]ML[[:space:]]Models/2.[[:space:]]Miscellaneous[[:space:]]Topics/12.1.[[:space:]]Hands[[:space:]]on[[:space:]]Live[[:space:]]Session_[[:space:]]Build[[:space:]]and[[:space:]]Deploy[[:space:]]an[[:space:]]ML[[:space:]]model[[:space:]]using[[:space:]]AWS[[:space:]]and[[:space:]]APIs__pdf filter=lfs diff=lfs merge=lfs -text
Module[[:space:]]5[[:space:]]Feature[[:space:]]Engineering[[:space:]]Productionization[[:space:]]and[[:space:]]deployment[[:space:]]of[[:space:]]ML[[:space:]]Models/2.[[:space:]]Miscellaneous[[:space:]]Topics/15.1.[[:space:]]ML[[:space:]]deployment[[:space:]]for[[:space:]]Heroku.pdf filter=lfs diff=lfs merge=lfs -text
Module[[:space:]]5[[:space:]]Feature[[:space:]]Engineering[[:space:]]Productionization[[:space:]]and[[:space:]]deployment[[:space:]]of[[:space:]]ML[[:space:]]Models/3.[[:space:]]Module[[:space:]]5[[:space:]]Live[[:space:]]Sessions/3.1.[[:space:]]LIVE_InterviewQues_Productionization.pdf filter=lfs diff=lfs merge=lfs -text
4.[[:space:]]Get[[:space:]]Started![[:space:]]Handwritting[[:space:]]Recognition,[[:space:]]Simple[[:space:]]Object[[:space:]]Classification[[:space:]]&[[:space:]]OpenCV[[:space:]]Demo/images/numbers.jpg filter=lfs diff=lfs merge=lfs -text
24.[[:space:]]GANS_Generative_Networks/Face-Aging-with-Identity-Preserved-Conditional-Generative-Adversarial-Networks-master/checkpoints/acgan.model-399999.meta filter=lfs diff=lfs merge=lfs -text
24.[[:space:]]GANS_Generative_Networks/Face-Aging-with-Identity-Preserved-Conditional-Generative-Adversarial-Networks-master/images/method_comp.JPG filter=lfs diff=lfs merge=lfs -text
24.[[:space:]]GANS_Generative_Networks/Face-Aging-with-Identity-Preserved-Conditional-Generative-Adversarial-Networks-master/checkpoints/acgan.model-399999.data-00000-of-00001 filter=lfs diff=lfs merge=lfs -text
24.[[:space:]]GANS_Generative_Networks/Face-Aging-with-Identity-Preserved-Conditional-Generative-Adversarial-Networks-master/checkpoints/0_conv5_lsgan_transfer_g75_0.5f-4_a30/acgan.model-399999.meta filter=lfs diff=lfs merge=lfs -text
23.[[:space:]]DeepDream[[:space:]]&[[:space:]]Neural[[:space:]]Style[[:space:]]Transfers/deepdream_outputs/dream_at_scale_(353,[[:space:]]529).png filter=lfs diff=lfs merge=lfs -text
23.[[:space:]]DeepDream[[:space:]]&[[:space:]]Neural[[:space:]]Style[[:space:]]Transfers/deepdream_outputs/dream_at_scale_(494,[[:space:]]740).png filter=lfs diff=lfs merge=lfs -text
23.[[:space:]]DeepDream[[:space:]]&[[:space:]]Neural[[:space:]]Style[[:space:]]Transfers/deepdream_outputs/dream_at_scale_(692,[[:space:]]1037).png filter=lfs diff=lfs merge=lfs -text
23.[[:space:]]DeepDream[[:space:]]&[[:space:]]Neural[[:space:]]Style[[:space:]]Transfers/images/aurora_norway.jpg filter=lfs diff=lfs merge=lfs -text
23.[[:space:]]DeepDream[[:space:]]&[[:space:]]Neural[[:space:]]Style[[:space:]]Transfers/deepdream_outputs/final_dream.png filter=lfs diff=lfs merge=lfs -text
23.[[:space:]]DeepDream[[:space:]]&[[:space:]]Neural[[:space:]]Style[[:space:]]Transfers/images/monetwaterlilies.jpg filter=lfs diff=lfs merge=lfs -text
23.[[:space:]]DeepDream[[:space:]]&[[:space:]]Neural[[:space:]]Style[[:space:]]Transfers/images/t-rex.jpg filter=lfs diff=lfs merge=lfs -text
23.[[:space:]]DeepDream[[:space:]]&[[:space:]]Neural[[:space:]]Style[[:space:]]Transfers/images/wolfpuppy.jpg filter=lfs diff=lfs merge=lfs -text
23.[[:space:]]DeepDream[[:space:]]&[[:space:]]Neural[[:space:]]Style[[:space:]]Transfers/style_transfer_results/starrynight_onto_eiffel_at_iteration_2.png filter=lfs diff=lfs merge=lfs -text
23.[[:space:]]DeepDream[[:space:]]&[[:space:]]Neural[[:space:]]Style[[:space:]]Transfers/style_transfer_results/starrynight_onto_eiffel_at_iteration_1.png filter=lfs diff=lfs merge=lfs -text
23.[[:space:]]DeepDream[[:space:]]&[[:space:]]Neural[[:space:]]Style[[:space:]]Transfers/style_transfer_results/starrynight_onto_eiffel_at_iteration_0.png filter=lfs diff=lfs merge=lfs -text
23.[[:space:]]DeepDream[[:space:]]&[[:space:]]Neural[[:space:]]Style[[:space:]]Transfers/style_transfer_results/starrynight_onto_eiffel_at_iteration_4.png filter=lfs diff=lfs merge=lfs -text
10. Data Augmentation/10.1 - 10.3 - Data Augmentation - Cats vs Dogs.ipynb
CHANGED
@@ -1,802 +1,3 @@
# Cats vs Dogs

### Loading our images
- Images are labeled catxxx.jpg and dogxxx.jpg
```python
# Get filenames in list
from os import listdir
from os.path import isfile, join

mypath = "./datasets/catsvsdogs/images/"

file_names = [f for f in listdir(mypath) if isfile(join(mypath, f))]

print(str(len(file_names)) + ' images loaded')
```

Output:
```
3002 images loaded
```
### Splitting our loaded images into a training and test/validation dataset
- We also need to store their labels (i.e. y_train and y_test)
- We resize our images here to maintain a constant dimension of 150 x 150
- We're going to use 1000 images of dogs and 1000 images of cats as our training data
- For our test/validation dataset we're going to use 500 images of each class
- Dogs will be labelled 1 and cats 0
- We store our new images in the following directories
  - /datasets/catsvsdogs/train/dogs
  - /datasets/catsvsdogs/train/cats
  - /datasets/catsvsdogs/validation/dogs
  - /datasets/catsvsdogs/validation/cats
```python
import cv2
import numpy as np
import os
import shutil

# Extract 1000 images per class for our training data and 500 per class for our validation set
# Takes about ~20 seconds to run
dog_count = 0
cat_count = 0
training_size = 1000
test_size = 500
training_images = []
training_labels = []
test_images = []
test_labels = []
size = 150
dog_dir_train = "./datasets/catsvsdogs/train/dogs/"
cat_dir_train = "./datasets/catsvsdogs/train/cats/"
dog_dir_val = "./datasets/catsvsdogs/validation/dogs/"
cat_dir_val = "./datasets/catsvsdogs/validation/cats/"

def make_dir(directory):
    # Recreate the directory from scratch so reruns start clean
    if os.path.exists(directory):
        shutil.rmtree(directory)
    os.makedirs(directory)

make_dir(dog_dir_train)
make_dir(cat_dir_train)
make_dir(dog_dir_val)
make_dir(cat_dir_val)

def getZeros(number):
    # Zero-pad the count so filenames sort correctly (001 ... 999)
    if 10 <= number < 100:
        return "0"
    if number < 10:
        return "00"
    return ""

for i, file in enumerate(file_names):

    if file_names[i][0] == "d":
        dog_count += 1
        image = cv2.imread(mypath + file)
        image = cv2.resize(image, (size, size), interpolation=cv2.INTER_AREA)
        if dog_count <= training_size:
            training_images.append(image)
            training_labels.append(1)
            zeros = getZeros(dog_count)
            cv2.imwrite(dog_dir_train + "dog" + str(zeros) + str(dog_count) + ".jpg", image)
        if training_size < dog_count <= training_size + test_size:
            test_images.append(image)
            test_labels.append(1)
            zeros = getZeros(dog_count - 1000)
            cv2.imwrite(dog_dir_val + "dog" + str(zeros) + str(dog_count - 1000) + ".jpg", image)

    if file_names[i][0] == "c":
        cat_count += 1
        image = cv2.imread(mypath + file)
        image = cv2.resize(image, (size, size), interpolation=cv2.INTER_AREA)
        if cat_count <= training_size:
            training_images.append(image)
            training_labels.append(0)
            zeros = getZeros(cat_count)
            cv2.imwrite(cat_dir_train + "cat" + str(zeros) + str(cat_count) + ".jpg", image)
        if training_size < cat_count <= training_size + test_size:
            test_images.append(image)
            test_labels.append(0)
            zeros = getZeros(cat_count - 1000)
            cv2.imwrite(cat_dir_val + "cat" + str(zeros) + str(cat_count - 1000) + ".jpg", image)

    if dog_count == training_size + test_size and cat_count == training_size + test_size:
        break

print("Training and Test Data Extraction Complete")
```

Output:
```
Training and Test Data Extraction Complete
```
### Let's save our datasets to NPZ files
```python
# Using numpy's savez function to store our loaded data as NPZ files
np.savez('cats_vs_dogs_training_data.npz', np.array(training_images))
np.savez('cats_vs_dogs_training_labels.npz', np.array(training_labels))
np.savez('cats_vs_dogs_test_data.npz', np.array(test_images))
np.savez('cats_vs_dogs_test_labels.npz', np.array(test_labels))
```
```python
# Loader Function
import numpy as np

def load_data_training_and_test(datasetname):

    npzfile = np.load(datasetname + "_training_data.npz")
    train = npzfile['arr_0']

    npzfile = np.load(datasetname + "_training_labels.npz")
    train_labels = npzfile['arr_0']

    npzfile = np.load(datasetname + "_test_data.npz")
    test = npzfile['arr_0']

    npzfile = np.load(datasetname + "_test_labels.npz")
    test_labels = npzfile['arr_0']

    return (train, train_labels), (test, test_labels)
```
### Let's view some of our loaded images
```python
for i in range(1, 11):
    random = np.random.randint(0, len(training_images))
    cv2.imshow("image_" + str(i), training_images[random])
    if training_labels[random] == 0:
        print(str(i) + " - Cat")
    else:
        print(str(i) + " - Dog")
    cv2.waitKey(0)

cv2.destroyAllWindows()
```

Output:
```
1 - Cat
2 - Cat
3 - Dog
4 - Cat
5 - Cat
6 - Dog
7 - Cat
8 - Cat
9 - Dog
10 - Dog
```
### Let's get our data ready in the format expected by Keras
- We also stick to the previous naming convention
```python
(x_train, y_train), (x_test, y_test) = load_data_training_and_test("cats_vs_dogs")

# Reshape our training labels from (2000,) to (2000,1) and our test labels from (1000,) to (1000,1)
y_train = y_train.reshape(y_train.shape[0], 1)
y_test = y_test.reshape(y_test.shape[0], 1)

# Change our image type to float32 data type
x_train = x_train.astype('float32')
x_test = x_test.astype('float32')

# Normalize our data by changing the range from (0 to 255) to (0 to 1)
x_train /= 255
x_test /= 255

print(x_train.shape)
print(y_train.shape)
print(x_test.shape)
print(y_test.shape)
```

Output:
```
(2000, 150, 150, 3)
(2000, 1)
(1000, 150, 150, 3)
(1000, 1)
```
### Let's create our model using a simple CNN that's similar to what we used for CIFAR10
- Except now we use a Sigmoid instead of Softmax
- **Sigmoids are used when we're doing binary (i.e. two-class) classification**
- Note the binary_crossentropy loss
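For reference, with sigmoid output p and true label y in {0, 1}, the binary cross-entropy loss used here is L = -(y * log(p) + (1 - y) * log(1 - p)); for exactly two classes this is equivalent to the softmax plus categorical cross-entropy setup used for CIFAR10.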
```python
from __future__ import print_function
from tensorflow.keras.preprocessing.image import ImageDataGenerator
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense, Dropout, Activation, Flatten
from tensorflow.keras.layers import Conv2D, MaxPooling2D
import os

batch_size = 16
epochs = 10

img_rows = x_train[0].shape[0]
img_cols = x_train[0].shape[1]
input_shape = (img_rows, img_cols, 3)

model = Sequential()
model.add(Conv2D(32, (3, 3), input_shape=input_shape))
model.add(Activation('relu'))
model.add(MaxPooling2D(pool_size=(2, 2)))

model.add(Conv2D(32, (3, 3)))
model.add(Activation('relu'))
model.add(MaxPooling2D(pool_size=(2, 2)))

model.add(Conv2D(64, (3, 3)))
model.add(Activation('relu'))
model.add(MaxPooling2D(pool_size=(2, 2)))

model.add(Flatten())
model.add(Dense(64))
model.add(Activation('relu'))
model.add(Dropout(0.5))
model.add(Dense(1))
model.add(Activation('sigmoid'))

model.compile(loss='binary_crossentropy',
              optimizer='rmsprop',
              metrics=['accuracy'])

print(model.summary())
```

Output:
```
Model: "sequential_1"
_________________________________________________________________
Layer (type)                 Output Shape              Param #
=================================================================
conv2d_3 (Conv2D)            (None, 148, 148, 32)      896
_________________________________________________________________
activation_5 (Activation)    (None, 148, 148, 32)      0
_________________________________________________________________
max_pooling2d_3 (MaxPooling2 (None, 74, 74, 32)        0
_________________________________________________________________
conv2d_4 (Conv2D)            (None, 72, 72, 32)        9248
_________________________________________________________________
activation_6 (Activation)    (None, 72, 72, 32)        0
_________________________________________________________________
max_pooling2d_4 (MaxPooling2 (None, 36, 36, 32)        0
_________________________________________________________________
conv2d_5 (Conv2D)            (None, 34, 34, 64)        18496
_________________________________________________________________
activation_7 (Activation)    (None, 34, 34, 64)        0
_________________________________________________________________
max_pooling2d_5 (MaxPooling2 (None, 17, 17, 64)        0
_________________________________________________________________
flatten_1 (Flatten)          (None, 18496)             0
_________________________________________________________________
dense_2 (Dense)              (None, 64)                1183808
_________________________________________________________________
activation_8 (Activation)    (None, 64)                0
_________________________________________________________________
dropout_1 (Dropout)          (None, 64)                0
_________________________________________________________________
dense_3 (Dense)              (None, 1)                 65
_________________________________________________________________
activation_9 (Activation)    (None, 1)                 0
=================================================================
Total params: 1,212,513
Trainable params: 1,212,513
Non-trainable params: 0
_________________________________________________________________
None
```
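As a quick sanity check on the summary, each parameter count follows directly from the layer shapes: the first convolution has (3 x 3 x 3) x 32 + 32 = 896 parameters, the second (3 x 3 x 32) x 32 + 32 = 9,248, the third (3 x 3 x 32) x 64 + 64 = 18,496, the 64-unit dense layer 18,496 x 64 + 64 = 1,183,808, and the output layer 64 + 1 = 65, which sum to the reported total of 1,212,513.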
### Training our model
```python
history = model.fit(x_train, y_train,
                    batch_size=batch_size,
                    epochs=epochs,
                    validation_data=(x_test, y_test),
                    shuffle=True)

model.save("cats_vs_dogs_V1.h5")

# Evaluate the performance of our trained model
scores = model.evaluate(x_test, y_test, verbose=1)
print('Test loss:', scores[0])
print('Test accuracy:', scores[1])
```

Output:
```
Train on 2000 samples, validate on 1000 samples
2000/2000 [==============================] - 42s 21ms/sample - loss: 0.7029 - accuracy: 0.5335 - val_loss: 0.6820 - val_accuracy: 0.5440
1000/1000 [==============================] - 5s 5ms/sample - loss: 0.6820 - accuracy: 0.5440
Test loss: 0.6819891605377197
Test accuracy: 0.544
```
### Testing our Classifier
```python
import cv2
import numpy as np
from tensorflow.keras.models import load_model

classifier = load_model('cats_vs_dogs_V1.h5')

def draw_test(name, pred, input_im):
    # Map the stringified class prediction ("[0]"/"[1]") to a readable label
    BLACK = [0, 0, 0]
    if pred == "[0]":
        pred = "cat"
    if pred == "[1]":
        pred = "dog"
    # Pad the image on the right so there is room to draw the label
    expanded_image = cv2.copyMakeBorder(input_im, 0, 0, 0, imageL.shape[0], cv2.BORDER_CONSTANT, value=BLACK)
    cv2.putText(expanded_image, str(pred), (252, 70), cv2.FONT_HERSHEY_COMPLEX_SMALL, 4, (0, 255, 0), 2)
    cv2.imshow(name, expanded_image)


for i in range(0, 10):
    rand = np.random.randint(0, len(x_test))
    input_im = x_test[rand]

    imageL = cv2.resize(input_im, None, fx=2, fy=2, interpolation=cv2.INTER_CUBIC)
    cv2.imshow("Test Image", imageL)

    input_im = input_im.reshape(1, 150, 150, 3)

    # Get Prediction
    res = str(classifier.predict_classes(input_im, 1, verbose=0)[0])

    draw_test("Prediction", res, imageL)
    cv2.waitKey(0)

cv2.destroyAllWindows()
```
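Note that `predict_classes` was deprecated in later TensorFlow releases and removed in 2.6; on newer versions the equivalent for this sigmoid model would be `(classifier.predict(input_im) > 0.5).astype("int32")`.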
### Analysis
- Our results aren't bad, but they could be better
# Now let's train our Cats vs Dogs Classifier using Data Augmentation
```python
import os
import numpy as np
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Activation, Dropout, Flatten, Dense
from tensorflow.keras.preprocessing.image import ImageDataGenerator
from tensorflow.keras.layers import Conv2D, MaxPooling2D, ZeroPadding2D
from tensorflow.keras import optimizers
import scipy
import pylab as pl
import matplotlib.cm as cm
%matplotlib inline

input_shape = (150, 150, 3)
img_width = 150
img_height = 150

nb_train_samples = 2000
nb_validation_samples = 1000
batch_size = 16
epochs = 5

train_data_dir = './datasets/catsvsdogs/train'
validation_data_dir = './datasets/catsvsdogs/validation'

# Creating our data generator for our test data
validation_datagen = ImageDataGenerator(
    # used to rescale the pixel values from the [0, 255] to the [0, 1] interval
    rescale = 1./255)

# Creating our data generator for our training data
train_datagen = ImageDataGenerator(
    rescale = 1./255,         # normalize pixel values to [0,1]
    rotation_range = 30,      # randomly applies rotations
    width_shift_range = 0.3,  # randomly applies width shifting
    height_shift_range = 0.3, # randomly applies height shifting
    horizontal_flip = True,   # randomly flips the image
    fill_mode = 'nearest')    # uses the 'nearest' fill mode to fill gaps created by the above

# Specify criteria about our training data, such as the directory, image size, batch size and type;
# automagically retrieve images and their classes for train and validation sets
train_generator = train_datagen.flow_from_directory(
    train_data_dir,
    target_size = (img_width, img_height),
    batch_size = batch_size,
    class_mode = 'binary',
    shuffle = True)

validation_generator = validation_datagen.flow_from_directory(
    validation_data_dir,
    target_size = (img_width, img_height),
    batch_size = batch_size,
    class_mode = 'binary',
    shuffle = False)
```

Output:
```
Found 2000 images belonging to 2 classes.
Found 1000 images belonging to 2 classes.
```
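To see what the augmentation actually does before training, one quick sanity check (not part of the original notebook) is to pull a single batch from `train_generator` and display a few of the transformed images; a minimal sketch, assuming only the generators defined above:

```python
import matplotlib.pyplot as plt

# Pull one augmented batch: images has shape (batch_size, 150, 150, 3), values already in [0, 1]
images, labels = next(train_generator)

# Show the first 8 augmented images with their binary labels (cats = 0, dogs = 1)
fig, axes = plt.subplots(2, 4, figsize=(12, 6))
for ax, img, label in zip(axes.ravel(), images, labels):
    ax.imshow(img)  # flow_from_directory yields RGB images
    ax.set_title('dog' if label == 1 else 'cat')
    ax.axis('off')
plt.show()
```

Each call to `next(train_generator)` draws fresh random rotations, shifts and flips, which is exactly why the network never sees the same training image twice.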
### Create our model, just like we did previously
```python
# Creating our model
model = Sequential()
model.add(Conv2D(32, (3, 3), input_shape=input_shape))
model.add(Activation('relu'))
model.add(MaxPooling2D(pool_size=(2, 2)))

model.add(Conv2D(32, (3, 3)))
model.add(Activation('relu'))
model.add(MaxPooling2D(pool_size=(2, 2)))

model.add(Conv2D(64, (3, 3)))
model.add(Activation('relu'))
model.add(MaxPooling2D(pool_size=(2, 2)))

model.add(Flatten())
model.add(Dense(64))
model.add(Activation('relu'))
model.add(Dropout(0.5))
model.add(Dense(1))
model.add(Activation('sigmoid'))

print(model.summary())

model.compile(loss='binary_crossentropy',
              optimizer='rmsprop',
              metrics=['accuracy'])
```

Output (the printed summary for "sequential_2" matches the first model's layer shapes and parameter counts, apart from the layer name suffixes):
```
Total params: 1,212,513
Trainable params: 1,212,513
Non-trainable params: 0
```
```python
history = model.fit_generator(
    train_generator,
    steps_per_epoch = nb_train_samples // batch_size,
    epochs = epochs,
    validation_data = validation_generator,
    validation_steps = nb_validation_samples // batch_size)
```

Output:
```
WARNING:tensorflow:From <ipython-input-13-03f251166bbb>:6: Model.fit_generator (from tensorflow.python.keras.engine.training) is deprecated and will be removed in a future version.
Instructions for updating:
Please use Model.fit, which supports generators.
Train for 125 steps, validate for 62 steps
Epoch 1/5
125/125 [==============================] - 64s 511ms/step - loss: 0.7178 - accuracy: 0.5190 - val_loss: 0.6823 - val_accuracy: 0.5887
Epoch 2/5
125/125 [==============================] - 45s 363ms/step - loss: 0.6883 - accuracy: 0.5555 - val_loss: 0.6647 - val_accuracy: 0.5746
Epoch 3/5
125/125 [==============================] - 46s 370ms/step - loss: 0.6841 - accuracy: 0.5795 - val_loss: 0.6531 - val_accuracy: 0.6411
Epoch 4/5
125/125 [==============================] - 47s 373ms/step - loss: 0.6642 - accuracy: 0.6160 - val_loss: 0.7110 - val_accuracy: 0.5151
Epoch 5/5
125/125 [==============================] - 50s 399ms/step - loss: 0.6643 - accuracy: 0.5815 - val_loss: 0.6231 - val_accuracy: 0.6905
```
## Plotting our Loss and Accuracy Graphs
[Output: matplotlib figure with training/validation loss and accuracy charts; the inline base64 PNG data is omitted here.]
3qiJoXcoaoi9m0dbPnkUFt9I5Tu7JT4N6fwLLuH47s3aop7dzsiUzOYkN4EncGeNZ+iFp4v1SBUKplXDvLSi9raW5gYPtmDGzfjHfGdOZw3GXjeYsLRKfm8MuBOH45EIedpRmDO7ox3N+D4A7Nan+eA52dTsrk/7ZpJ2Y/GN8FS3MdvpXqqf/TeieoNCEE0/r68sbKcBaFRutTIGrh/TJpgRBCjAC+AMyA76SUc8tZZgIwB62/7IiUcrLx8RbAd4CP8blRUspoU+ZVTM9gEAT6NiHQtwmzR3bkdFIW643FIjwxgzVHz7Pm6HkszQzc1taF4f4e3NHJnWYODbiLBygpkbz8xzEKiyWTerWgZ8umekdSbmJcdy/mrjtJ6Lk0Tidl0t694TV9mqxACCHMgHnAUCAe2C+EWCWljCizTDvgZaCflPKSEMKtzCoWAu9JKTcKIewBNRZ1AyOEoIOHAx08HHjy9nbEpeWwISKJ9eEXOBCdxtZTKWw9lcIr4hiBLZqUdkS1cNGxa8RElu6P5WDMJVztrZg9oqPecZRKcLC2YFx3L5bsjWVxaAxvj6kj83PUIFPuQfQCzkgpzwEIIZYBY4CIMss8BMyTUl4CkFImG5f1A8yllBuNj9ePEb2UW+LT1JYH+7fiwf6tuJiVz+YTSawPT2Jn5EUOxFziQMwl3lt7go4eDqXFopOnQ73v8knOyGPuupMAzLnbDyfbxnVorT6b1teXJXtj+eNQAi+O6Ii9VcM6ai9MNe2kEOIeYISUcqbx/jSgt5TyiTLLrABOA/3QDkPNkVL+JYQYC8wECoBWwCZgtpSy+JptzAJmAbi7uwcuW7as2nmzsrKwt697HSMqF+QWSY6lFHMwqYgjKcXklfktaGYj6OFuRqC7OW2dDeRkZ9e792teWB77LxQT0MyMZ3pY1WrBU79fVVNervf35nL6UgnT/SwZ0kKf4n4r71dwcPBBKWVQec+ZstyV91t+bTUyB9oBgwFvYIcQorPx8QFAdyAW+AWYAXx/1cqkXAAsAAgKCpKDBw+udtitW7dyK683FZVLM9L4Z35RMbvPprIh/AIbI5JIySpgfXQR66OLcLW3xN/ZghlD/bitjQtW5nWnI6qi92vLyST2XziAjYUZ8+4fiE/T2j18pn6/qqa8XBlNEnly6WH2pVnx1rQBuuzRmur9MmWBiEc7wXyFN5BYzjKhUspCIEoIcQqtYMQDh8scnloB9OGaAqE0PlbmZgR3cCO4gxvvjpUcir3E+uMXWB9xgbi0XLZlwbYf9+NgZW7siHJncAe3Ornrn51fxOsrtClEnxvWvtaLg1IzRvh74GqvTeG7P/oSvVo1nAYDU/6v2Q+0E0K0AhKA+4DJ1yyzApgE/CSEcAXaA+eAy0ATIUQzKWUKMAQ4gKKUYWYQ9GzZlJ4tm/LqnZ04cT6Tb9aEcirbmpMXMvnzSCJ/HknE0txA/7auDPd3545O7rjUkXGNPt14moTLuXT2cmTGbS31jqNUk6W5gUm9fPhqyxkWhcaoAlEZUsoiIcQTwHq08ws/SCnDhRBvAweklKuMzw0TQkQAxcALUspUACHE88Bmoe2vHQS+NVVWpf4TQuDX3JFx7SwZPHggsak5pe2zB2MvseVkMltOJmMQxwhq2dR4ktsd7yb6fGs/Fp/Oj7uiMAj4YFwA5npciavUmEm9WjAv5Ax/HT9PcmYn3BwaxkCVJt3vllKuBdZe89gbZX6WaDPTXTc7nbGDKcCU+ZSGq4WLLQ8NbM1DA1uTnJnHpohk1odfYPfZi+yLSmNfVBrvrI7Av7ljaUdUe3f7Wjl+XFRcwsvLj1Ii4cH+reji7WTybSqm1dzZhjs6ubMhIolf98fxxJC6N45UddS9A7OKUsPcHKyZ3LsFk3u3ICOvkJCTyWwITyLkVDLhiRmEJ2bw6cbTtHSx1YYq9/egu48zBhONEfXT7miOJ2Tg5Wyj70BvSo2a1teXDRFJLNkbyyOD2jSIvUJVIJRGxdHagjHdvBjTzYu8wmJ2nbnI+vALbDqRTHRqDt9sP8c328/h5mDFUD93hvt70Ke1S40NexF/KYdPNpwG4O0x/tjVwZPnSvX0a+NKK1c7oi5ms/lkMsP9PfSOdMvUb6fSaFlbmHF7J3du7+ROUXEJB2IuaQMKhieRcDmXJXtjWbI3Fgdrc243jhE1sH2zan+oSyl5Y2U4uYXF3NnFk9s7ud/8RUq9YTAIpvbx5Z3VESwOjVEFQlEaCnMzA31au9CntQtv3OVHeGJG6Unu00lZrAhLZEVYIlbmBga0c2WYcYyopnaVG7RweWQBWU3Ps+VkMg7W5rw5ugFOIapwTw9vPlp/kh2RFzmXklXvh2tXBUJRriGEoLOXE529nHhuWAeiLmaXFovDsZfZdCKZTSeSjW22TUrPW3g5VzxN+8qzhexO1kaZeWlERzUdawPlZGvBmK5e/HIgjiV7Y3m9ns8lrgqEotxEK1c7HhnUhkcGtSEpI48NEdqseXvOphJ6Lo3Qc2m89WcEXbycGO6vnbdo63Z9R1RKZj6Bvk2Y3KuFTn8TpTZM6+vLLwfi+O1AHM8P64CNZd25or+qVIFQlCpwd7RmWh9fpvXxJT1X64haH36BradSOJaQzrGEdD7ecJrWrnYMM15rUVisjTBjYSb4YHwXk3VHKXVDZy8nuvk4ExZ3mVVHEpjYs/5+IVAFQlGqycnGgrHdvRjbXeuI2hF5pSMqiXMXs/m/bWf5v21nubIj8fDANg1yzgDletP7+hIWd5mFe2KYEORTb0ccrv+NuopSB1hbmDHUz52P7+3KtD6+Vz13ZcDk/4ScoeXsNaW3zzae1iGpUhtGdfGkia0F4YkZhMVd1jtOtak9CEWpYc8N68BzwzoAWmvr6aQshn++nei5d+qcTKkt1hZmTOjpwzfbzrFoTwzdWzTRO1K1qD0IRTGhK7PmKY3P1N6+CAGrj54nLbtA7zjVogqEoiiKCfg0tSW4gxsFxSX8eiBO7zjVogqEoiiKiVw5H7VkbwzFJaaZvdOUVIFQlFowpo2aZ7oxGti+GT5NbYhLy2X76RS941SZKhCKUgvGtavckBxKw2JmEEzpre1FLNwTrWuW6lAFQlEUxYQmBPlgaW5g6+kUYlNz9I5TJapAKIqimFBTO0vuCvBESliyL0bvOFWiCoSiKIqJXTlZ/ev+OPIKi3VOU3mqQCiKophYNx9nOns5cimnkDVHz+sdp9JUgVAURTExIUTpXsSi0PpzmEkVCEVRlFpwd1cvHK3NCYu7zLH4dL3jVIoqEIqiKLXAxtKMe4N8AFhcT/YiVIFQFEWpJVN6a3NDrDySQHpOoc5pbk4VCEVRlFrSupk9A9q5kldYwm8H6/74TKpAKIqi1KKppeMzxVJSx8dnUgVCURSlFt3e0Y3mTtZEXcxm19mLese5IVUgFEVRapG5mYHJxnMRi/bU7ZPVqkAoiqLUsgk9fbAwE2w6kUTC5Vy941RIFQhFUZRa5uZgzYjOnpRIWLo3Vu84FVIFQlEURQfT+2onq5ftj6WgqETnNOVTBUJRF
EUHQb5N6OjhwMWsAv4Kv6B3nHKpAqEoiqIDIURpy+viOnqy2qQFQggxQghxSghxRggxu4JlJgghIoQQ4UKIn695zlEIkSCE+I8pcyqKouhhbHcv7K3M2RedxskLGXrHuY7JCoQQwgyYB4wE/IBJQgi/a5ZpB7wM9JNS+gNPX7Oad4BtpsqoKIqiJ3src8b38ALqZsurKfcgegFnpJTnpJQFwDJgzDXLPATMk1JeApBSJl95QggRCLgDG0yYUVEURVdXhgFffjiBzLy6NT6TKQuEF1B2sJF442NltQfaCyF2CSFChRAjAIQQBuAT4AUT5lMURdFdO3cH+rRuSk5BMcsPJ+gd5ypCStOMBSKEuBcYLqWcabw/DeglpfxXmWVWA4XABMAb2AF0BqYCtlLKD4UQM4AgKeUT5WxjFjALwN3dPXDZsmXVzpuVlYW9vX21X28qKlfVqFxVo3JVjaly7btQxPywfJrbC97rZ4MQotZyBQcHH5RSBpX7pJTSJDegL7C+zP2XgZevWeb/gBll7m8GegJLgFggGrgIZABzb7S9wMBAeStCQkJu6fWmonJVjcpVNSpX1ZgqV0FRsQx6d6P0fWm13H3mYpVffyu5gAOygs9VUx5i2g+0E0K0EkJYAvcBq65ZZgUQDCCEcEU75HROSjlFStlCStkSeB5YKKUstwtKURSlvrMwMzCplzY+U12aTMhkBUJKWQQ8AawHTgC/SinDhRBvCyHuNi62HkgVQkQAIcALUspUU2VSFEWpqyb3aoGZQbA+/AJJGXl6xwFMfB2ElHKtlLK9lLKNlPI942NvSClXGX+WUspnpZR+UsouUsrrTiJIKX+S5Zx/UBRFaUg8nKwZ5udOUYlk2b66MZmQupJaURSljrjS8vrzvhgKi/Ufn0kVCEVRlDqibxsX2jSzIykjn00RSXrHwVzvAKZUWFhIfHw8eXk3P57n5OTEiRMnaiFV1ahcVVOVXNbW1nh7e2NhYWHiVIpSOUIIpvXxZc6fESwKjWFkF09d8zToAhEfH4+DgwMtW7a8aV9xZmYmDg4OtZSs8lSuqqlsLiklqampxMfH06pVq1pIpiiVMz7Qm3//dYrdZ1M5k5xJWzf9/p816ENMeXl5uLi4VPmiE6XhE0Lg4uJSqb1LRalNjtYWjO2uDTqxOFTfyYQadIEAqlUcPtt42gRJlLpGfXFQ6qqpfbRrIn4/GE92fpFuORp8gaiOLzZH6h1BUZRGzL+5E4G+TcjML2JlWKJuOVSBMKHBgwezfv36qx77/PPPeeyxx274uitjqiQmJjJt2rQK133gwIEbrufzzz8nJyen9P6oUaO4fPlyZaKXa8+ePbRq1Ypu3brRr18/7O3t6dChA926dWP69OlVWldJSQlz586t8Hlvb+9byqoo9d2VKUkX7om+MhRRrVMFwoQmTZrEtQMILlu2jEmTJlXq9c2bN2fRokXV3v61BWLt2rU4OztXe31//fUXH3/8MWFhYezatYugoCCWLFlCWFgYCxcurNK6blYgFKWxG9HZAxc7S05eyORQ7CVdMjToLqayWs5eY5Llo+feWeFz99xzD6+99hr5+flYWVkRHR1NYmIi/fv3JysrizFjxnDp0iUKCwt59913GTPm6ukyoqOjGTVqFBEREeTm5nL//fcTERFBp06dyM3NLV3u0UcfZf/+/eTm5nLPPffw1ltv8eWXX5KYmEhwcDCurq6EhITQsmVLDhw4gKurK59++ik//PADADNnzuTpp58mOjqakSNH0r9/f3bv3o2XlxcrV67ExsYGgM2bN/Pss89W+PctKirixRdfZOfOneTl5fHkk08yc+ZMEhISmDhxIllZWRQVFbFgwQL++OMPMjMz6datGwEBAZUqMBcvXuSBBx4gOjoae3t7FixYQOfOndmyZQvPPPNM6TmFXbt2cfny5eu2edttt910G4pSV1iZmzGxpw/zt55l4Z4YAn2b1noGtQdhQi4uLvTq1Yu//voL0PYeJk6ciBACa2trli9fzqFDhwgJCeG555674W7k119/ja2tLUePHuXVV1/l4MGDpc+99957HDhwgKNHj7Jt2zaOHj3Kk08+SfPmzQkJCSEkJOSqdR08eJAff/yRvXv3Ehoayrfffsvhw4cBiIyM5PHHHyc8PBxnZ2d+//13QPtwtrCwwMnJqcKMCxYswM3NjX379rF//37mzZtHbGwsixcvZvTo0YSFhXHkyBECAgKYO3cuDg4OVdr7eP311+nduzdHjx5lzpw5zJgxA4CPPvqIBQsWEBYWxrp167C2ti53m4pS30zu3QKDgLXHznMxK7/Wt99o9iBu9E0fru6fbzl7zU2Xr6wrh5nGjBnDsmXLSr+1Syl55ZVX2L59OwaDgYSEBJKSkvDw8Ch3Pdu3b+fJJ58EICAg4KoPvF9//ZUFCxZQVFTE+fPniYiIuOEH4s6dOxk3bhx2dnYAjB8/nh07dnD33XeXnmMACAwMJDo6GoANGzYwbNiwG/5dN2zYwIkTJ0oPq6WnpxMZGUnPnj15+OGHycvLY+zYsXTt2pWioqp3ZuzcuZM1a7Q9u2HDhjFjxgyys7Pp168fTz/9NJMnT2b48OF4enqWu01FqW+8m9gypKM7m04k8cv+OB4Pblur21d7ECY2duxYNm/ezKFDh8jNzaVHjx4ALFmyhJSUFA4ePEhYWBju7u437ckvry0zKiqKjz/+mM2bN3P06FHuvPPOm67nRnsqVlZWpT+bmZmVfpCvW7eOESNG3HS98+fPJywsjLCwMKKiorj99tsZMmQIW7duxdPTkylTprBkyZIbrqeyua/cf+211/jmm2/Iyspi8ODBREZG1tg2FUVv04wnq3/eG0txSe2erFYFwsTs7e0ZPHgwDzzwwFUnp9PT03Fzc8PCwoKQkBBiYm48BvzAgQNLP+SOHz/O0aNHAcjIyMDOzg4nJyeSkpJYt25d6WscHBzIzMwsd10rVqwgJyeH7Oxsli9fzoABAyrctpSSo0ePlu5ZVGT48OHMnz+/tKicOnWK3NxcYmJi8PDwYNasWcyYMYPDhw9jbq7tvFZlT6Lse7Bp0ya8vb2xs7Pj7NmzBAQE8PLLLxMQEMCpU6fK3aai1EcD2rri62JLwuVcQk4m1+q2G80hJj1NmjSJ8ePHX9XRNGXKFEaPHk1QUBDdunWjY8eON1zHo48+yv33309AQADdunWjV69eAHTt2pXu3bvj7+9P69at6devX+lrZs2axciRI/H09LzqPESPHj2YMWNG6TpmzpxJ9+7dSw8nXevgwYN07979pheWPfzww8TGxpYWEjc3N1auXMnmzZv59NNPsbCwwN7ensWLFwPw4IMPEhAQQFBQULnnIfz9/Uu3OXnyZN5+++3S98De3p4ff/wRgI8//pgdO3ZgMBjo1KkTw4YNY/HixeVuU1HqG4NBMLW3L++tPcHC0Bju8HOvvY1XNNVcfbuVN+VoRETEDSbau1pGRkbpz59uOFXp15la2Vx6eeedd+TSpUuveqwu5CpPVXNV5XfkVjS2KTRvlcp1tUvZ+bL9
q2ul70urZVRK1nXP18cpR+utZ4a21ztCnfLaa69x33336R1DURotZ1tL7u7aHIAle2tvSlJVIBRFUeqBKyerfz0QT15hca1sUxUIRVGUeiDA25mu3k6k5xay6kjtjM+kCoSiKEo9MdU4Jeni0No5zKQKhKIoSj0xumtznG0tOBqfzpE40w9mqQpEWTs/g0z954FVFEUpj7WFGROCfABYVAt7EapAlJWVDLu+qLHVpaam0q1bN7p164aHhwdeXl6l9wsKCiq1jkcffZRTp07dcJl58+bV2JXC/fv3JywsrEbWpShKzZvSW5tM6M8jiVzKrtznSHWpC+XK6vcUzO+j/elw6xejuLi4lH7YzpkzB3t7e55//vmrlintNzaUX6u//vrrm86x/Pjjj99yVkVR6gdfFzsGtW/GttMp/HYwjlkD25hsW42nQMypeBRSgKs+gj+pwnUQc9KrHOXMmTOMHTuW/v37s3fvXlavXs1bb71VOl7TxIkTeeONNwBtULqvv/6azp074+rqyiOPPMK6deuwtbVl5cqVuLm58dprr+Hq6srTTz9N//796d+/P1u2bCE9PZ0ff/yR2267jezsbKZPn86ZM2fw8/MjMjKS77777qbDZwDk5ubyyCOPcOjQISwsLPj888/p3r07x44d44EHHqCwsJCSkhJWrFhBs2bNmDBhAomJiRQXFzNnzhzuueeeKr9HiqJUbFofX7adTmFxaCwz+7c22XbU6DXyGwAAC+lJREFUISadRERE8OCDD3L48GG8vLyYO3cuBw4c4MiRI2zcuJGIiIjrXpOens6gQYM4cuQIffv2LR0Z9lpSSvbt28dHH33E22+/DcBXX32Fh4cHR44cYfbs2VUam+jLL7/E0tKSY8eOsWjRIqZNm0ZBQQHz58/n+eefJywsjP3799O8eXPWrl1Ly5YtOXLkCMePH2fo0KHVe4MURalQcEc3vJxtiE3LYXtkCssjTXOoqRHtQdz4m35mZiYOVgZYEAz9n4Zuk00ap02bNvTs2bP0/tKlS/n+++8pKioiMTGRiIgI/Pz8rnqNjY0NI0eOBLShuHfs2FHuusePH1+6zJXxlXbu3MlLL70EaOM3+fv7Vzrrzp07eeGFFwBtfKTmzZtz7tw5brvtNt59911iYmIYP348bdu2JSAggNmzZzN79mxGjx591dhQivL/7d17jFTlGcfx749ll7UuSAIWoYuiBROu1YVQLGqoFKTQyB/eaBRK09ZLG6UlpcWm6cWg6SWt9UKCl9JKFZDaYikqiFxKWhRdKIK6VZBiim7DpSrd4G3p0z/OuzCMZ5iZ3TlzJuzzSTZ75px35jz77My857xn5nldaVR1EdeMOZOfrXqFh559nadf+5DSXT09xs8gMj3+bagflXjnABydiwGiSXruvPNO1q1bx/bt25k0aVJsye6ampqjy5mluLO1lezObGMdmNM2132nT5/O8uXL6datGxMmTGDjxo0MHjyYxsZGhg4dypw5c7j99tvbvV/nXG5XjepPTVUX1iZY4dU7iKDri8vgza0w+edl3/ehQ4fo3r07PXr0oLm5mdWrV5d8HxdeeCHLli0DYMeOHbFDWLlkltluamqiubmZc845h927dzNw4EBmzZrFlClT2L59O2+88QZ1dXVMnz6d2bNns3Xr1pL/Lc456F3XjcnDz6ADx355dZ4hpjy6HD4AV/4Wak7N27bUGhoaGDJkCMOGDftIye5Suemmm5gxYwYjRoygoaGBYcOG5Zw+9NJLL6W6uhqAiy66iIULF3L99dczfPhwqqurWbRoETU1NSxevJglS5ZQXV1Nv379mDdvHps2bWLu3Ll06dKFmpoaFixYUPK/xTkXmX7BWTy2LSq78X7rEbp1rSrp46sjQw+VZNSoUdbY2HjcuqamJgYPHlzQ/TOnHK0kpYqrtbWV1tZWamtr2blzJxMnTmTnzp1HJ+5JK65SKzauYp4jHbFhwwbGjRuX+H6K5XEVpxLiumPNq9y5dmfedrPGDyqoMrWkLWY2Km6bn0F0Ei0tLYwfP57W1lbMjHvvvbfdnYNzLj3fmnDucW/8bx/+gPNuXcOen0wp+b78HaKT6NmzJ1u2bEk7DOdcifX8WE3+Ru2U6EVqSZMkvSJpl6S5OdpcJellSS9JWhzWnSfpmbBuu6Sr2xvDyTKE5krPnxvOnVhiZxCSqoD5wARgL/C8pBVm9nJGm0HALcBYM3tL0sfDpsPADDPbKakfsEXSajMrqnxhbW0tBw8epFevXnnnU3adi5lx8OBBamtr0w7FuYqV5BDTaGCXme0GkLQUmApkfr7ya8B8M3sLwMz2hd+vtjUwszcl7QNOB4rqIOrr69m7dy/79+/P2/a9996ryDcLj6s4xcRVW1tLfX19whE5l7ypn6xO5HGT7CA+Afwr4/Ze4NNZbc4FkPQ3oAr4kZmtymwgaTRQA7yWvQNJ1wHXAfTp04cNGza0O9iWlhbq6uraff+keFzFKTau118vz8QrLS0tHXp+JsXjKk6lxjWh7wfJxNVWTbTUP8CVwAMZt6cDd2e1WQksB6qBs4k6kZ4Z2/sCrwBj8u1v5MiR1hHr16/v0P2T4nEVx+MqjsdVnJMxLqDRcryvJnmRei/QP+N2PZA9kepe4E9m9qGZ/TN0BoMAJPUAHge+b2bPJhinc865GEl2EM8DgySdLakGmAasyGrzGPBZAEm9iYacdof2y4FFZvb7BGN0zjmXQ6LfpJY0GfgV0fWFhWZ2m6RbiU5pVij6aNEvgEnAEeA2M1sq6VrgN8BLGQ8308xyTnUmaT/QkQHl3sCBDtw/KR5XcTyu4nhcxTkZ4zrLzE6P23DSlNroKEmNluPr5mnyuIrjcRXH4ypOZ4vLq7k655yL5R2Ec865WN5BHHNf2gHk4HEVx+MqjsdVnE4Vl1+DcM45F8vPIJxzzsXyDsI551ysTtVBSFooaZ+kF3Nsl6S7Qnny7ZIaKiSucZLekbQt/PygTHH1l7ReUlMovT4rpk3Zc1ZgXGXPmaRaSc9JeiHE9eOYNt0kPRLytVnSgAqJa6ak/Rn5+mrScWXsu0rS3yWtjNlW9nwVEFOaudojaUfYb2PM9tK+HnPV4DgZf4CLgQbgxRzbJwNPAgLGAJsrJK5xwMoU8tUXaAjL3YFXgSFp56zAuMqes5CDurBcDWwmq44Y8HVgQVieBjxSIXHNBO4p93Ms7Hs2sDju/5VGvgqIKc1c7QF6n2B7SV+PneoMwsw2Av85QZOpROU9zKL6Tz0l9a2AuFJhZs1mtjUs/xdoIqrSm6nsOSswrrILOWgJN6vDT/anQKYCD4blR4HxSniykgLjSoWkemAK8ECOJmXPVwExVbKSvh47VQdRgLgS5am/8QQXhCGCJyUNLffOw6n9+URHn5lSzdkJ4oIUchaGJrYB+4A1ZpYzX2bWCrwD9KqAuAAuD8MSj0rqH7M9Cb8CvgP8L8f2NPKVLyZIJ1cQdexPSdqiaLqDbCV9PXoHcby4I5NKONLaSlQv5VPA3URFDstGUh3wB+CbZnYoe3PMXcqSszx
xpZIzMztiZucRVS8eLWlYVpNU8lVAXH8GBpjZCOBpjh21J0bSF4B9ZnaiydLLmq8CYyp7rjKMNbMG4PPANyRdnLW9pPnyDuJ4hZQoLzszO9Q2RGBmTwDViqrfJk5SNdGb8MNm9seYJqnkLF9caeYs7PNtYANRIcpMR/MlqStwGmUcXswVl5kdNLP3w837gZFlCGcscJmkPcBS4BJJD2W1KXe+8saUUq7a9v1m+L2PqOL16KwmJX09egdxvBXAjPBJgDHAO2bWnHZQks5oG3dVNMNeF+BgGfYr4NdAk5n9MkezsueskLjSyJmk0yX1DMunAJ8D/pHVbAXwpbB8BbDOwtXFNOPKGqe+jOi6TqLM7BYzqzezAUQXoNeZ2bVZzcqar0JiSiNXYb+nSuretgxMBLI/+VjS12OSU45WHElLiD7d0lvSXuCHRBfsMLMFwBNEnwLYBRwGvlwhcV0B3CipFXgXmJb0m0owlmgmwB1h/Brge8CZGbGlkbNC4kojZ32BByVVEXVIy8xspTJK3BN1bL+TtIvoSHhawjEVGtfNki4DWkNcM8sQV6wKyFe+mNLKVR9geTju6QosNrNVkm6AZF6PXmrDOedcLB9ics45F8s7COecc7G8g3DOORfLOwjnnHOxvINwzjkXyzsI5/KQdCSjcuc2SXNL+NgDlKOKr3Np61Tfg3Cund4NZSqc61T8DMK5dgq1+X+qaK6F5yQNDOvPkrQ2FHNbK+nMsL6PpOWhgOALkj4THqpK0v2K5mp4KnzbGUk3S3o5PM7SlP5M14l5B+FcfqdkDTFdnbHtkJmNBu4hqgJKWF4Uirk9DNwV1t8F/CUUEGwAXgrrBwHzzWwo8DZweVg/Fzg/PM4NSf1xzuXi36R2Lg9JLWZWF7N+D3CJme0OxQP/bWa9JB0A+prZh2F9s5n1lrQfqM8o9NZWrnyNmQ0Kt78LVJvZPEmrgBaiSrSPZczp4FxZ+BmEcx1jOZZztYnzfsbyEY5dG5wCzCeqFrolVDN1rmy8g3CuY67O+P1MWN7EsaJy1wB/DctrgRvh6AQ+PXI9qKQuQH8zW080eU1P4CNnMc4lyY9InMvvlIyqsQCrzKzto67dJG0mOtj6Ylh3M7BQ0hxgP8cqas4C7pP0FaIzhRuBXKWYq4CHJJ1GNAnMHWEuB+fKxq9BONdO4RrEKDM7kHYsziXBh5icc87F8jMI55xzsfwMwjnnXCzvIJxzzsXyDsI551ws7yCcc87F8g7COedcrP8DAFLYHmF4tIUAAAAASUVORK5CYII=\n",
-      "text/plain": [
-       "<Figure size 432x288 with 1 Axes>"
-      ]
-     },
-     "metadata": {
-      "needs_background": "light"
-     },
-     "output_type": "display_data"
-    }
-   ],
-   "source": [
-    "# Plotting our loss charts\n",
-    "import matplotlib.pyplot as plt\n",
-    "\n",
-    "history_dict = history.history\n",
-    "\n",
-    "loss_values = history_dict['loss']\n",
-    "val_loss_values = history_dict['val_loss']\n",
-    "epochs = range(1, len(loss_values) + 1)\n",
-    "\n",
-    "line1 = plt.plot(epochs, val_loss_values, label='Validation/Test Loss')\n",
-    "line2 = plt.plot(epochs, loss_values, label='Training Loss')\n",
-    "plt.setp(line1, linewidth=2.0, marker = '+', markersize=10.0)\n",
-    "plt.setp(line2, linewidth=2.0, marker = '4', markersize=10.0)\n",
-    "plt.xlabel('Epochs') \n",
-    "plt.ylabel('Loss')\n",
-    "plt.grid(True)\n",
-    "plt.legend()\n",
-    "plt.show()"
-   ]
-  },
-  {
-   "cell_type": "code",
-   "execution_count": 15,
-   "metadata": {},
-   "outputs": [
-    {
-     "data": {
-      "image/png": "<base64 PNG data omitted (training/validation accuracy chart output)>\n",
-      "text/plain": [
-       "<Figure size 432x288 with 1 Axes>"
-      ]
-     },
-     "metadata": {
-      "needs_background": "light"
-     },
-     "output_type": "display_data"
-    }
-   ],
-   "source": [
-    "# Plotting our accuracy charts\n",
-    "import matplotlib.pyplot as plt\n",
-    "\n",
-    "history_dict = history.history\n",
-    "\n",
-    "acc_values = history_dict['accuracy']\n",
-    "val_acc_values = history_dict['val_accuracy']\n",
-    "epochs = range(1, len(loss_values) + 1)\n",
-    "\n",
-    "line1 = plt.plot(epochs, val_acc_values, label='Validation/Test Accuracy')\n",
-    "line2 = plt.plot(epochs, acc_values, label='Training Accuracy')\n",
-    "plt.setp(line1, linewidth=2.0, marker = '+', markersize=10.0)\n",
-    "plt.setp(line2, linewidth=2.0, marker = '4', markersize=10.0)\n",
-    "plt.xlabel('Epochs') \n",
-    "plt.ylabel('Accuracy')\n",
-    "plt.grid(True)\n",
-    "plt.legend()\n",
-    "plt.show()"
-   ]
-  },
-  {
-   "cell_type": "code",
-   "execution_count": null,
-   "metadata": {},
-   "outputs": [],
-   "source": []
-  }
- ],
- "metadata": {
-  "kernelspec": {
-   "display_name": "Python 3",
-   "language": "python",
-   "name": "python3"
-  },
-  "language_info": {
-   "codemirror_mode": {
-    "name": "ipython",
-    "version": 3
-   },
-   "file_extension": ".py",
-   "mimetype": "text/x-python",
-   "name": "python",
-   "nbconvert_exporter": "python",
-   "pygments_lexer": "ipython3",
-   "version": "3.7.4"
-  }
- },
- "nbformat": 4,
- "nbformat_minor": 2
-}
+version https://git-lfs.github.com/spec/v1
+oid sha256:750dcb623ec0693ac4fc91ea386e409df22841d5b51ef3df09053d361d112d72
+size 84376
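
Note: with the notebook now stored behind the Git LFS pointer above, its cells are no longer readable in the diff. The removed cells plotted training-vs-validation loss and accuracy curves from the Keras History object. The following is a minimal self-contained sketch of that pattern, assuming TensorFlow 2.x / tf.keras; the toy model and synthetic data are illustrative stand-ins, not taken from the notebook:

```python
import numpy as np
import matplotlib.pyplot as plt
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense

# Synthetic stand-in data (assumption, not from the notebook):
# 256 samples, 16 features, binary label.
x = np.random.rand(256, 16).astype("float32")
y = (x.sum(axis=1) > 8.0).astype("float32")

model = Sequential([
    Dense(8, activation="relu", input_shape=(16,)),
    Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])

# fit() returns a History object; its .history dict holds per-epoch metrics
# under the keys 'loss', 'val_loss', 'accuracy', 'val_accuracy'.
history = model.fit(x, y, validation_split=0.2, epochs=5, verbose=0)

epochs = range(1, len(history.history["loss"]) + 1)
for train_key, val_key, ylabel in [("loss", "val_loss", "Loss"),
                                   ("accuracy", "val_accuracy", "Accuracy")]:
    plt.plot(epochs, history.history[val_key], marker="+", label="Validation " + ylabel)
    plt.plot(epochs, history.history[train_key], marker="4", label="Training " + ylabel)
    plt.xlabel("Epochs")
    plt.ylabel(ylabel)
    plt.grid(True)
    plt.legend()
    plt.show()
```

Plotting the training and validation curves side by side like this is what lets you spot overfitting: the training curve keeps improving while the validation curve flattens or reverses.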
10. Data Augmentation/10.4 - Data Augmentation Demos.ipynb
CHANGED
@@ -1,420 +1,3 @@
-{
- "cells": [
-  {
-   "cell_type": "markdown",
-   "metadata": {},
-   "source": [
-    "# Data Augmentation\n",
-    "\n",
-    "### Let's look at our untouched dataset"
-   ]
-  },
-  {
-   "cell_type": "code",
-   "execution_count": 1,
-   "metadata": {},
-   "outputs": [
-    {
-     "data": {
-      "text/plain": [
-       "<Figure size 640x480 with 9 Axes>"
-      ]
-     },
-     "metadata": {},
-     "output_type": "display_data"
-    }
-   ],
-   "source": [
-    "# Plot images\n",
-    "from tensorflow.keras.datasets import mnist\n",
-    "from matplotlib import pyplot\n",
-    "\n",
-    "# load data\n",
-    "(x_train, y_train), (x_test, y_test) = mnist.load_data()\n",
-    "\n",
-    "# create a grid of 3x3 images\n",
-    "for i in range(0, 9):\n",
-    "    pyplot.subplot(330 + 1 + i)\n",
-    "    pyplot.imshow(x_train[i], cmap=pyplot.get_cmap('gray'))\n",
-    "    \n",
-    "# show the plot\n",
-    "pyplot.show()"
-   ]
-  },
-  {
-   "cell_type": "markdown",
-   "metadata": {},
-   "source": [
-    "### Random Rotations\n",
-    "- As per Keras documentation random is 50%"
-   ]
-  },
-  {
-   "cell_type": "code",
-   "execution_count": 2,
-   "metadata": {},
-   "outputs": [
-    {
-     "data": {
"image/png": "iVBORw0KGgoAAAANSUhEUgAAAU4AAAD7CAYAAAAFI30bAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADh0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uMy4xLjEsIGh0dHA6Ly9tYXRwbG90bGliLm9yZy8QZhcZAAAgAElEQVR4nO2dd9QURdaHnyuCCQOIKAKKuogiZkXM6Iphdc2ImLOurovhrMqKn7JrwLCs4egqKoK7ZjGHxYRgRBQjIkFMBEFEBbNofX/M3K6alzdMT+iZnvc+53Cmp6vf6WJ+0923qm4Q5xyGYRhG/ixV6Q4YhmGkDbtxGoZhxMRunIZhGDGxG6dhGEZM7MZpGIYRE7txGoZhxKSoG6eI7CkiU0RkuoicV6pOGZXFdK1dTNvSIIX6cYpIC2Aq0AeYCUwA+jvn3i9d94ykMV1rF9O2dCxdxN/2BKY752YAiMjdwH5AgyKISHP3tp/vnFut0p1oAtM1PmnQFWJqa7o2rGsxQ/WOwGfB+5nZfUbDfFLpDuSB6RqfNOgKpm1cGtS1GItT6tm3xBNKRE4CTiriPEaymK61S5Pamq75UcyNcybQOXjfCZhd9yDn3DBgGJjpnxJM19qlSW1N1/woZqg+AegqIuuISCvgUOCR0nTLqCCma+1i2paIgi1O59xiEfkzMBpoAQx3zk0qWc+MimC61i6mbeko2B2poJOZ6f+Gc26rSnei1JiupmuN0qCuFjlk1DTLLLMMyyyzTKW7YdQYduM0DMOISTGr6oZRVXTq1CnaXnfddQHo2DHjprjhhhsC8Pjjj0fHjB8/PsHeNR9EMl5PLVq0AODXX3+N2lq2bAnAzz//nHzHSohZnIZhGDGxxaFksUWEIll6aT9IateuHQC9e/cG4KCDDoraunXrBngrVP9OrSCAU089FYCRI0cW2y3TNWCTTTYBoHv37gCsssoqUduCBQsAaNOmDQCzZ2fcSN99993oGLVQ58+fD8APP/xQSDdKgS0OGYZhlAq7cRqGYcSk5haHll12WQB+/PHHaJ9OVlsp5PSz4oorRtvXX389AD179gRgzTXXjNoWL14MQKtWrXL+XvcDbLHFFgA88kgmeOarr74qQ4+bB+3bt4+2DzvsMABOP/10AL7++uuobY011gDg888/z/n7b775Jtp++eWXAZgzZw6QO4wfO3YsAHPnzi1Z3wvBLE7DMIyYpMLiVBeGAw88EICPP/44alt//fUBWH311QFYuHAhAMstt1x0zNtvvw14a+PFF18sb4eNkqN67r///tG+XXbZBYBZs2YBue5ITz31FOAt1M022wyA1q1bR8fovtAKNQpjpZVWirY33XRTwGsWWpx67b355psAdO3aFci9pjt3zuQh6devHwDvv+/The65554AvPDCCwCMGTNmib9PArM4DcMwYlK1FucKK6wQbd99992Ad2/QeUzwbibqULvyyisD3koFopC7SZMy+QwOP/zwqO2dd94ped+N0vPLL78A3rEdYMCAAQB89NFHQK7lqL8HdXvR31DojqTzn2uvvTYA7733Xln6XsvodbrRRhtF+3bffXfAz1FedNFFUZtaiDqnqSPE0GVJLdTNN98cgFNOOSVqO+qoowDYcsstAdhtt90AeOWVV6Jjnn/+eaC8eprFaRiGERO7cRqGYcSkyaG6iAwH9gHmOed6ZPe1Be4BugAfA4c458rmy3HDDTcAfjEgjDHWyf+ffvoJgN9++w3IHdLpIsAhhxwCwFlnnRW1HXPMMWXqdXVTDbrGQaNJBg8eHO1T9zJtU7ezsG3nnXcG/HB+1VVXjY7RoXrSCwvlJklt9fo74IADon061NaIrJtvvrnJz6nPvei1114D/EISwJAhQwA4+eSTATj++OMBv6AEsMMOOwDePe2WW26J2m6//fYm+5IP+VicI4A96+w7D3jWOdcVeDb73kgXIzBda5URmLZlpUmL0zk3TkS61Nm9H9A7uz0SeB44t4T94rvvvou2n3vuOQCefPLJJv9OJ//DxSWdXN53332BXAujPof55kCldC0UtSAbcx0KAxxUf10c2nrrrZc4Xi0ZtUK//fbb0nS2wiSprY70dIENYOrUqZmTFJkDQEePeo7ws889N9P1Sy+9FPCLROADIlZbLVPZ95///GfUduSRRwIwaNAgoPAMWYWuqq/unJsD4JybIyLtGzrQqualCtO1dslLW9M1P8rujlSKqnnhE6cpdL4rdMjVOQ+1NJ955pmorblZmqWi2qshqmWqQRP6PsyupC406jj/2We+5LhaO82NuLqqla8WIHg3pPD7LBWqi7qbaQalu+66KzpGtzVrllqXABtssAHg576nTZsG+KxN+VLoqvpcEekAkH2dV+DnGNWF6Vq7mLYlpFCL8xHgaGBI9vXhkvWoROhKOsBaa60FwGOPPQZYyGUjVL2ujRGOMvr06QP4lfbQ0lR0LkwdtsNkEuqYXUOURVtdQdcVcMj1big39SXuWW+99QCfkzVc79DfgQbFFJrYpUmLU0TuAl4BuonITBE5nsyX30dEpgF9su+NFGG61i6mbfnJZ1W9fwNNvy9xX4wEMV1rF9O2/FRtrHqhqGvJGWecEe3ThQGNVzZqg7Zt2wI+Q9Y666wTtV111VU5x9SH5inQRcMaHJ5XhErkvQ1zEJx0UsYp4OCDD16i7dprrwXgwQcfBArvq4VcGoZhxKRmLE6dkN5pp50AXzAKYNSoUQDMmDEDyA27+/LLLwGfTUndmZqrO0q1ouGR6tQMsP322wMwcOBAwC8CwpKWprqdLVq0KNp35plnArn5Ho10oQtBf/jDH6J9moFew0E1dyd4Z/hiyxObxWkYhhGTmrE41WLs1asXkJvpW10mzjnnHCB3XkMzx6uzruby01ejtISZ+XVOUl1CQncirUmjzul77bUXAFtt5au1ar5GneMMddV5bf2cZ599FvAuaQCvv/46UDuhls2BuvXD1PF+7733jo7RUeO9994L5DrHF2tpKmZxGoZhxMRunIZhGDGpmaG6DuX698+4sIVDwuOOOw7wCwRaXgN8yv0jjjgC8NlTNM4VyhNz29zQIfehhx4a7dOInY4dOwJ+8S5Ey8nqsD5c2KvLzJkzo23NAanRQKpzpcvKGsWh0UBa/kav9zAC6D//+Q8Al1xyCVCefBRmcRqGYcSkZixOReNStTwswOzZswEfT3vbbbdFberC8o9//APw2XTUiRbgggsuKGOPaxudzNfFulNPPTVqCx3WwRfjAz+Jr25ISmhZaG5GtVQ1byv4XJBW+je9LLVUxq4LF3o187uWDtasRjfeeGN0jC4Kff/99+XrW9k+2TAMo0apGYvz4YczyV6uvvpqILdcqFoi9Tm1q9VyxRVXAHDQQQcBPocn+NKl6tZk5I+6jSy//PIAzJvns5nVtThD6lqaalWqNQHeytAghjBvq5YTNtKLuqeF5YHV4pw1axbgaxDpvCaUzuWoMcziNAzDiEnNWJyKzlXGRTNBjx49GshN+
KBzLUbh1F3dBm8hqpfDpptuusTfqTOzroaHeR81+3dYn8pIP3q9qadMaHFqHk1N0jF27FggGSszJJ98nJ1FZIyITBaRSSIyILu/rYg8LSLTsq9tyt9do1SYrrWJ6ZoM+ZhSi4GznXMbAr2A00SkO1ZuNO2YrrWJ6ZoA+SQyngNodbxFIjIZ6EgVl5ItBF280CG7DhOgNvM0Jq2rLtZoRiLwzswdOnQAfAEtgNNPPx3wDu/bbrst4IMZADbeeGPAx59PnDix2G6mnlq4Xvv27Qv434CW8AZ46KGHALjpppuA+oMmkiDWHGe2VvPmwHis3GjNYLrWJqZr+cj7xikirYFRwBnOuYX5FmSq9jKyimbRad8+83v69NNPozYNF9RM4bVEJXWdMmVKzuv06dOjtnHjxgFw4oknAt7i1Hyr4AvyaUjdJ598ErWpo3xzzaualutV+9WmjZ9yPe+8zCzChhtuCMBll10Wtd13330AfPjhh+XuWqPktVwsIi3JiHCHc+6B7G4rN5pyTNfaxHQtP01anJJ5JNwKTHbODQ2aUl1Kti49evQAoF27dkBuOVm1RmuJatQ1TNKhFuObb74J+Nybao2AL++rLmhhaN7gwYPL29kqpRp1bQzN4H7UUUdF+zbaaCPArzfofCZUz7WYz1B9e+BI4F0ReSu7729kBLg3W3r0U6BvebpolAnTtTYxXRMgn1X1F4GGJkis3GhKMV1rE9M1GWoucigfwkggdYXR1Pvq4qKRLuBdYjQ+1ig/daOBtJTJ5MmTo32aaUlzMoa5PnXYfs011wC5C0dG9aDDci2wBr4MjuaY0NIo4EvdfPDBB0BufoIksVhCwzCMmDRLizN0UVFXFrVCL7roIiB3QtqoPJpl6Ysvvoj2qZuKln+97rrroratt94a8NaLUV2su+66gM9CpotE4HOoamWAW2+9NWo74YQTAB+zbhanYRhGSmiWFmeIur2ETrZG9aLZkgB++OEHACZNmgTkZurXEUToVG9UDxrirOW8Q8tR57NffvllAAYNGhS16Vx1pTNimcVpGIYRE9G5o0ROVsUhlwnxhnNuq0p3otRUi6467wU+P2NCv2/TNSZaVeHss88GcqsraDUHrS4b5tpM8n5FI7qaxWkYhhETu3EahmHExIbqyWJDutrEdC2SsDif5m5NeFheHzZUNwzDKBVJuyPNB77LvqaNdhTf77VL0ZEqxHStTRLTtQzF1sqqa6JDdQAReT2Nw5q09jsp0vr9pLXfSZHW76fc/bahumEYRkzsxmkYhhGTStw4h1XgnKUgrf1OirR+P2ntd1Kk9fspa78Tn+M0DMNIOzZUNwzDiIndOA3DMGKS2I1TRPYUkSkiMl1Ezmv6LyqDiHQWkTEiMllEJonIgOz+tiLytIhMy762aeqzmgtp0NZ0jY/p2sh5k5jjFJEWwFSgDzATmAD0d869X/aTxyRbc7qDc26iiKwIvAHsDxwDLHDODcn+iNo4586tYFergrRoa7rGw3RtnKQszp7AdOfcDOfcz8DdwH4JnTsWzrk5zrmJ2e1FwGSgI5n+jsweNpKMOEZKtDVdY2O6NkJRN84YpnxH4LPg/czsvqpGRLoAmwPjgdWdc3MgIxbQvnI9Ky8xh2ip07a56gq1fc0mqWvBN86sKX89sBfQHegvIt0bOryefVXtByUirYFRwBnOuYWV7k9SxNQVUqZtc9UVavuaTVxX51xB/4BtgdHB+4HAwMaOJfPFN+d/XxT6fSf1L46uwfGV/l4r/a/qdS3wmq3091rpfw3qWkx2pPpM+W3qHiQiJwEnARsXca5a4ZNKdyAP4upqpENXyENb0zWHBnUtZo4zL1PeOTfMZbKUHFDEuYzkiKWrS2HmnGZMk9qarvlRzI1zJtA5eN8JmN3Qwc65J4o4l5EcsXQ1UoVpWyKKuXFOALqKyDoi0go4FHikNN0yKojpWruYtiWi4DlO59xiEfkzmUWfFsBw59ykkvXMqAima+1i2pYOK9aWLFbUqzYxXWsTK9ZmGIZRKuzGaRiGEZOkq1xWLSIZT42llso8S3799ddKdscwmj0rrLACAB06dIj2TZ8+HYDlllsOgB9++CH5jmEWp2EYRmyavcW5zDLLANClSxcAdt55ZwCGDUtrqZX0ssoqq0Tb7dq1A+CzzzKBLj/99FNF+mRUjvXWWw+AE088Mdqn1udDDz0EwFtvvQX43wnAN998U/a+mcVpGIYRE7txGoZhxKRZDtVXWmmlaHuttdYCYODAgQAcdthhAGy55ZbRMSeffHKCvWt+LL/88gD06dMn2nfAAZnUBkOHDgXg9ddfT75jRkXRBaB999032qdTOPvssw8AY8eOBWDWrFnRMQ888AAAjz32WNn6ZhanYRhGTJqVxdmyZUsA1l133Wjf1VdfDfhFIaVXr15LbL/66qvl7mKzRKPX2rTx9bTU+nz66aeB/CxOdSkDr/XPP/9csn4aybJo0SIADj/88GifjgR1pLjDDjsAudfrtttuC/hRyznnnBO1ffnllyXpm1mchmEYMWkWFqdaMmuvvTYAgwYNitr06VSXuXPnRtszZ84EvHP8b7/9VpZ+NldatGgB+PllgFVXXRWADz/8EPDWZGO5FdZYY41oe8cddwTg/fczRRk/+OADABYvXlyqbhtlRrWfP39+tO/FF18EoHPnTHa84cOHA15vgNNPPx2AI488com2vn37AvD2228X1TezOA3DMGJiN07DMIyYNDlUF5HhwD7APOdcj+y+tsA9QBfgY+AQ59xX5etm/uhwWiePwU8Sn3DCCQCsttpqUVurVq1y/n7hwkyBvPfeey/a99133wG1NUSvJl2XXXZZwEdvgR9S//jjj0DjOQRUw7POOivad8QRRwBw4YUXAvDJJ5nyMbrgUMtUk7YNkc+0l0aLffHFF0u0hZFC4GPYwUcTXX755QBssMEGUZsuIk2bNg2A77//PnbfIT+LcwSwZ5195wHPOue6As9m3xvpYgSma60yAtO2rDRpcTrnxmULvYfsB/TObo8EngfOLWG/YtOxY0cA9twz83sJF32OP/74vD/n66+/BuDxxx+P9n377bel6GJVUU26brbZZkDu018X5DRDTmOLQnvssUfOK8BHH30EeMtCLdfmQDVpW5cePXoA/nrV/ATvvvtudIy6kOko4Zdffmnyc8NcBvo7evbZZ4Hce8H+++8PwL333ptzbFwKXVVf3Tk3B8A5N0dE2jd0oJUbTRWma+2Sl7ama36U3R3JOTcMGAaFp+JXZ2ad9+rUqVPUps7sRx99NOAdYrt27brE53z++eeAd38BPz+28sorA36OM3RHyueJ19woha7KdtttB+Tqqo7vOr/V2FzYpptuCsDqq68e7bvzzjsBGD9+PGAa5kspdVXCEOe9994bgAEDBgB+zjrMjDVq1CgAnnzySQA+/vjjqE3dytQ9TUMw9foF7zB/7LHHAv7+AfC///0P8CPLQil0VX2uiHQAyL7OK6oXRrVgutYupm0JKdTifAQ4GhiSfX24ZD3K0rp162i7W7duABxzzDEA
tG3bNmpTC1MTAqy44opLfNYrr7wCwJgxYwB44glf4l1XXXfbbTfAz6Vpns5mRtl1DdFVdB01hJqrhZnP/LLOhYXzoDo3qq+q50YbbRQdo6ut6jVx1113xf9PpIdEtQ0Js7TPm5d7v9a5yVD7fv36AdC7d28gN7/mO++8A3ineNUwXDnXz9LfzjPPPBO1XX/99UDjc+b50KTFKSJ3Aa8A3URkpogcT+bL7yMi04A+2fdGijBdaxfTtvzks6rev4Gm35e4L0aCmK61i2lbfqouVl2doXUSGeC6664D/HC8viGdup9oDKrGsIKPeX3ttdcAP6EMSy4aaF6/MItKPnHSRnxUO12QC9HFvV122QXwbiPhsaqHupuEixA69bL00pmfuC4shlmw1Bn+xhtvLPa/YjRCeI3dfffdgF+022abbQCfmwD8ta+LfZtssknUtvHGGwPeZUndzcJ4dl3Y1cWlW265JWorVRFGC7k0DMOISdVZnJpv7+KLL472aYikPrnCp8YLL7wAwIQJEwD417/+BeROKNd1cl1zzTWjbX3S6WeqhRK6N5ilWR7UetQFg1DXDTfcEIDzzz8fqH+xSEcemilHRyvgLRl1uFaLZMqUKdExDz74IOAtE6P86EKRZq3S11A7LZSoow39DYC/P+jocfDgwUBuKK0GT+jvKXQtLBVmcRqGYcSkaixOdRfR8p/hfJXOY6hV+fLLL0dtGqo1bty4vM/Vvr0PmlhnnXUA7xSvyUGaU4hepVAnZLUwdtppp6hNNVKrQS3GcPSwxRZbAN6KCa0WnZcePXo0AC+99BKQ+9uZOnUqYDk646DhsfqdhclwiiG83nRbrcpwlKAW52233Qb4EWfSmf7N4jQMw4iJ3TgNwzBiUjVDdR1arbfeeoAvGRu23XTTTQBMnDgxaouTuUgXfsLYVc0LqNEjWlJU49qN8qPx6Lvvvnu0T8ud6NBQI0zCobou2ulw/vbbb4/aNt98cwAeffRRAO6//37A62zkj1434KfUtMyJuv3pdBoU7/Kj17suGmomJfB6qttgpYrxmcVpGIYRk6qxOPUppU+X8CmnCzfqRqQFmwpFi7aBd4ZXdxddhDAXpOSo77v+6qtMcnLNL9AYOmrQOGbwjtLqjqSfYxZnfMLFMx0JHHXUUYD/7q+99troGA0eKfS7VvdDzXIUWpxz5swBfK7NSmEWp2EYRkyqxuJUVxK1LvVJBn5+Sy2TQmv/6JMzDLms61itT7swP2B9IYFG9aAahtap5vjUeVPN4zhixIjomFKF3zUn9HvUDGOnnnoq4L9vgGuuuQbweTTD7O6KzlWr+5/OSQP07NkTgL/85S9Abu5MDZ+stHZmcRqGYcSkaixODZnSOcawhsiVV14JwNixY4FcazSO9am5OsNcm/rkUod7TQ4xefLk6Jg77rgDKLw+iZEMmjgC4LnnngN8FvC+ffsC3msCyhOKV+uoFamr6ZpgJbQYNZGHZvFfsGBB1KZBB3q9achzWC9Kr1O9Bh966KGoLZ857yTIJx9nZxEZIyKTRWSSiAzI7m8rIk+LyLTsa5vyd9coFaZrbWK6JkM+Q/XFwNnOuQ2BXsBpItIdKzeadkzX2sR0TYB8EhnPAbQ63iIRmQx0pEzlRrWUhcalA7z66quAd3Mo1FVIF6DCxaGwcBt4N5ZwcSicGqgVktY1CaZPnx5t61BSsyTpUPK0006LjrnooouAwhcbq5GkdFWXQI0d10Ui8AtFdV3CwJfvVid3zXAVxqOrjjfccAMA//3vfwvtZtmINceZrdW8OTAeKzdaM5iutYnpWj7yvnGKSGtgFHCGc26hPjGaotByo+EkfqkIJ6kVDdlSa1RdltTRFuKFdaaNpHUtJ6GjtoblquU5dOhQwLu4gF9o0Nfw/572AIikdNUF06uuuirapxnbNdOZvoJf9NUgFP37cGFPF4iLLeFbTvIag4pISzIi3OGceyC728qNphzTtTYxXctPkxanZB5VtwKTnXNDg6aKlRstFJ3PDOuT1H0SaylhTTwRHpN2KySklnStD7U+1VG7V69eAOyzzz7RMZpZfNdddwVg9uzZUdudd94J5FYSSAPVoKuGvk6aNGmJNr0GdaSnroGh+2EayGeovj1wJPCuiLyV3fc3MgLcmy09+inQtzxdNMqE6VqbmK4JkM+q+otAQxMkVm40pZiutYnpmgxVEzmUBJpxKYw9b9WqFeCjkh5+ODOCCTMw1dIQvbmhpaEvueQSAGbMmBG1HXTQQQD069cPyC0KZqVTiqe+ePK6+9I2RFdqz0HRMAyjzEiS1lSl3VZ0kUeddsHn+tP8j3GKvhXAG865rcp5gkpQaV3jELrGqEuMLhaqdQqxRxmma23SoK5mcRqGYcSkWVmcShhmqVZoQiVizTKpTUzX2sQsTsMwjFLRrFbVlUpnjzYMI92YxWkYhhETu3EahmHExG6chmEYMbEbp2EYRkySXhyaD3yXfU0b7Si+32uXoiNViOlam5iuDZCoHyeAiLyeRp+3tPY7KdL6/aS130mR1u+n3P22obphGEZM7MZpGIYRk0rcOIdV4JylIK39Toq0fj9p7XdSpPX7KWu/E5/jNAzDSDs2VDcMw4iJ3TgNwzBiktiNU0T2FJEpIjJdRM5L6rxxEZHOIjJGRCaLyCQRGZDd31ZEnhaRadnXNpXua7WQBm1N1/iYro2cN4k5ThFpAUwF+gAzgQlAf+fc+2U/eUyyNac7OOcmisiKwBvA/sAxwALn3JDsj6iNc+7cCna1KkiLtqZrPEzXxknK4uwJTHfOzXDO/QzcDeyX0Llj4Zyb45ybmN1eBEwGOpLp78jsYSPJiGOkRFvTNTamayMUdeOMYcp3BD4L3s/M7qtqRKQLsDkwHljdOTcHMmIB7SvXs/ISc4iWOm2bq65Q29dskroWfOPMmvLXA3sB3YH+ItK9ocPr2VfVflAi0hoYBZzhnFvY1PG1QkxdIWXaNlddobav2cR1dc4V9A/YFhgdvB8IDGzsWDJffHP+90Wh33dS/+LoGhxf6e+10v+qXtcCr9lKf6+V/tegrsVkR6rPlN+m7kEichJwErBxEeeqFT6pdAfyIK6uRjp0hTy0NV1zaFDXYuY48zLlnXPDXCZLyQFFnMtIjli6uhRmzmnGNKmt6Zofxdw4ZwKdg/edgNkNHeyce6KIcxnJEUtXI1WYtiWimBvnBKCriKwjIq2AQ4FHStMto4KYrrWLaVsiCp7jdM4tFpE/k1n0aQEMd85NKlnPjIpgutYutaCtSO5sg6tQkqJEsyOJSGX+l9XDG7U4d2S6mq5JkfCNs0Fdk645ZBgVJbzwWrZsCUC7du0AmD3bpvuqiVVXXRWA4cOHR/v22msvAAYNGgTAFVdckXzHsOxIhmEYsTGL02hWbLTRRtH2tttuC0CPHj0A6NjRRxT269cPgF9//TXB3hkAyy67LABnn302APvuu2/UtnjxYgC6deuWfMcCzOI0DMOISc1ZnF26dAHgyy+/XKLtu+++A+C3335LsktGFdCzZ08Ajj766Ghf//79AVh++eU
BaNWqVdR26aWXAnD++ecD3tIxykOLFi2ibbUwBw4cuMRxP/zwAwDjxo1LpmMNYBanYRhGTOzGaRiGEZNUDtV18hhggw02yHndfvvtAVhxxRWjYz766CMAPv74YwBeeumlqG369Oll7auRHKGrkW4feeSRgHdj0UWfkEceyQTPfP3119E+dYHR4bsN1cuD6rTVVt5d8rbbbss55pdffom2r776agBGjhxJJTGL0zAMIyapsjj16b/DDjtE+9Si2GWXXQBYbbXVgNwFILVQP//8cwAGDBgQtX3ySSZzVPhUM9KFOrJ36NAh2qe/i759+wKwySabALmW47///W8Abr31VgDee++9qE0tIbM0y8syyywDeNcj8It16gqmOgEMGTIkwd41jFmchmEYMUmFxalPpcMOOwyAs846K2pT96OZM2cCMGHCBACWWso/E9TRec011wRgt912i9reeustwHVxGKoAAAyASURBVOY604L+FgB+/vlnwLsY7brrrlHb/vtnanPpaGPSpEwui3vuuSc65vbbbwdg1qxZgDm7J4lenzoiOPjgg6M2jT+fMmUKAP/3f/8XtX3//fdJdbFRzOI0DMOIid04DcMwYtLkUF1EhgP7APOccz2y+9oC9wBdgI+BQ5xzX5WyY6FrySqrrALAn/70JwC6du0atenQfOjQoQA8+OCDAPTq1Ss65tRTTwXg8MMPB2CzzTaL2pprFFGldC0UnZLRRR6Agw46CPCuLOqSBn7YPXr0aACee+45AK688sroGP2NVSqnY7lIg7Y65bLffplS7eH1/tNPPwFw2mmnAfDNN98k3LumycfiHAHsWWffecCzzrmuwLPZ90a6GIHpWquMwLQtK01anM65cdlC7yH7Ab2z2yOB54FzS9gv1lhjjWj73HMzH73eeusB8MUXX0RtTz/9NABPPfVUzt/rhD/AxhtnCmyqa0noAD9v3rxSdjs1VErXfFELRDMWXXLJJQBst9120TFrrbVWzrGaiwB8jLnGNOsiYEitWZpKNWurMemq54EHHrjEMbrQ+/zzzzf4OXUTGockoWuhq+qrO+fmADjn5ohI+4YOtHKjqcJ0rV3y0tZ0zY+yuyM554YBwyBeKv4wZLJ79+6Ad2EI3RNGjBih58n5+y222CLabtu2LQALFiwA4LPPfGnpanFvSBuF6toYrVu3jrY7deoE+Aw5O+20U87+EA1eeOIJX0j1uuuu036WomvNhnLoGqKWoq5T6Pvwmtxnn31y/kYDHMAHuNQNndVwaoCHHnoIKK/2ha6qzxWRDgDZ1+Y53q09TNfaxbQtIYVanI8ARwNDsq8Pl6xHWZZe2ndNa8LMmTMHyH061X2qrLzyykBu0gB9Smkyh/fff7/U3W2UFK3ell3X+tAQu9Ca1CQOOj+93HLLNfj3apHoyjvA3//+dwBGjRoFwFdfZRaQNcS2GVIRbeuiYbF77713zv4PP/ww2tZkK7q2cc4550RtOnqsS+gds/baawN+rrQcNGlxishdwCtANxGZKSLHk/ny+4jINKBP9r2RIkzX2sW0LT/5rKr3b6Dp9yXui5EgpmvtYtqWn6qNVdfhG8C0adMAP6EcuiMpOrQ/5phjADj++OOjtrfffhvwufzGjx8ftZVq+KyZm4499tglPledry0ePhd1gtbhmy7ogC91oc7Q6gQdllhYuHAh4Kdyttxyy6hNneE1Zv3NN98EYNiwYdExL7/8MtB8gyAqwQEHHAAs6U4UTq3p9anuZvkQ5qZQVzQNfCnHFJmFXBqGYcSkai3O0GFZi86rZVKf5aaZjzTUMswS/8ADDwAwefJkoPgnULhQoZbRySefDPgQMnV9Avjd734H5E5yG16jiy++GMjNWqWW5muvvQb4AIewCJ9O/uv3q78B8IsPuk9/O9tss010zA033ADA9ddfD1juzSSorwAb5Lqihdt10Zy68+fPB5Z0VQQ/6rzpppuA+oMfisUsTsMwjJhUrcUZPv3feOMNwM9jhm06F7rjjjsC/gmkVgTAHXfcAfg5sULRecxNN9002nf66acDPhek5ogMz6UlTY1cNHwyzOiv6Dy2Zv++6667ljhG5zt1jrJNmzZRm+qv+my99daA/52An/NW61YtFEiF61gqCZ3ZG0K/e82dev/990dtr776as7nqGthOArU+dPQpbHUmMVpGIYRE7txGoZhxKRqh+r1Ud/kvQ7TdDFA45bHjh0bHTN79uyizqsZejSfpy4EgY+p//TTTwHv9vLwwz4wQ2NnjVx+/PHHnNcQHa6ttNJKDf593VIX4YKcbl9wwQUA/PGPfwT8dAt49yUtKX3fffdFbeEilFEcYex5OJ0CfpEnXDjV60WjverjlFNOAXIXgRW9FnWKrxyYxWkYhhGTVFmcSjjp27t3bwC6desGeFelsNRrHHTh54gjjoj2aSEptX7Cp9yTTz4JwLXXXgv4/I9G04wZMwaARx99FIAzzzwzauvcuTMAe+yxB+DzpuqxkJ/7kGa/0hFAqKu6sOjCUZiRyyzO0hGOBHTxVK+h2267DfC5CaDxgATNkqUubPXlgTjqqKOW2FdqzOI0DMOISSotzrBErDqcr7/++oB/gmmGFfCWRX1PMnW61vnL3/8+E84bzq3pts6bhnkf1QIKs7sY8dByz2qNgM+MoyF6qkGYAf7yyy8HvEXTmKWi85hqyYK3Vl588UUgN4O8UTpeeeWVaFvd9NTiDK/lhghdjTRz/KqrrppzTOjk/sEHHxTe2Twxi9MwDCMm+VS57AzcDqwB/AYMc85dU8mqeaFDueZw1DkwdYwNHZ11pV3rGKl1CT5TvFo0uur6+OOPR8foPOY777wD5OZ01BCwtFGNup53nq8fpiuqQ4Zksp/pSKBHjx7RMVqtdNCgQYDP1wreCtXkEYcccgjgczUCzJgxA/DWSn3JY9JGNeoazjXWHRUcd9xxQG71UQ2l1TnnO++8M2rTkYMyd+5cAPbdd99oXzjaLBf5WJyLgbOdcxsCvYDTRKQ7VjUv7ZiutYnpmgBN3jidc3OccxOz24uAyUBHMlXzdClsJLB/uTpplB7TtTYxXZNB4izZZ0uOjgN6AJ8651YJ2r5yzrVp4E/1mJL4B4SF2NSM1ww5usCgjujgczMq4WSzDvunTp0KwDPPPAP4kgvgh+aNOeTmyRvOua2aPixZqkXX+tBgg8GDBwO5Duw6LfPtt98CuYsQuk+d3HWaJsy8o+Vn1Tk+XJyKiemaJ5onV3MJ6OJQOM2ii3TaFi7oKVOmTAF87s0HH3wwaithftUGdc17VV1EWgOjgDOccwsbq2tc5++s3GgVY7rWJqZrecnL4hSRlsBjwGjn3NDsvilA72yN5g7A8865bk18TsmfYK+//jrgi3qpRRJmJ1LXBw3vCou9XXXVVYC3PCdMmAD4xaYSU1WWSTXrWhcNdNBFonCfBj9oJvgQtT5mzZoF+AU+gL/+9a+AD9ErolS06Zon6hqoTupqeYYVHxpD71ea3f3mm28Glgy/LREN6ppPsTYBbgUmqwhZtGoeVLBqnlEYpmttYr
omQ5MWp4jsALwAvEvGvQHgb8B44F5gLeBToK9zbkG9H+I/q+RPMK0pUzenY/j/UotC3RzCeTKdy0wo+3fVWCbVrmtDhI7PqqMGQeg8JvjQWR15aPb/4cOHR8eo20oRc5uK6RoTtTz79esHwIUXXhi1aTCLTi/ofCbAZZddBvg5zWJz7DZB4XOczrkXgYYmSKxqXkoxXWsT0zUZLHLIMAwjJrHckYo+WRlNf40y0MxJoeuQti1atKhcp8+XqhnSlZIkh+qNEeYX0AVBdUvSMhtl+g2YroWfI+cV/DRbFZQvKXxxyDAMw8glldmR6qMxS6IKLE0jAcKFgjIvGhglooqsy1iYxWkYhhETu3EahmHExG6chmEYMbEbp2EYRkzsxmkYhhETu3EahmHExG6chmEYMbEbp2EYRkySdoCfD3yXfU0b7Si+32s3fUgqMV1rE9O1ARKNVQcQkdfTGNeb1n4nRVq/n7T2OynS+v2Uu982VDcMw4iJ3TgNwzBiUokb57AKnLMUpLXfSZHW7yet/U6KtH4/Ze134nOchmEYaceG6oZhGDFJ7MYpInuKyBQRmS4i5yV13riISGcRGSMik0VkkogMyO5vKyJPi8i07GubSve1WkiDtqZrfEzXRs6bxFBdRFoAU4E+wExgAtDfOfd+2U8ek2zN6Q7OuYkisiLwBrA/cAywwDk3JPsjauOcO7eCXa0K0qKt6RoP07VxkrI4ewLTnXMznHM/A3cD+yV07lg45+Y45yZmtxcBk4GOZPo7MnvYSDLiGCnR1nSNjenaCEndODsCnwXvZ2b3VTUi0gXYnExN6tWdc3MgIxbQvnI9qypSp63pmhemayMkdeOsr85zVS/ni0hrYBRwhnPOCtg0TKq0NV3zxnRthKRunDOBzsH7TsDshM4dGxFpSUaEO5xzD2R3z83Op+i8yrxK9a/KSI22pmssTNdGSOrGOQHoKiLriEgr4FDgkYTOHQvJFHi+FZjsnBsaND0CHJ3dPhp4OOm+VSmp0NZ0jY3p2th5k3KAF5E/AFcDLYDhzrlLEjlxTERkB+AF4F3gt+zuv5GZN7kXWAv4FOjrnFtQkU5WGWnQ1nSNj+nayHktcsgwDCMeFjlkGIYRE7txGoZhxMRunIZhGDGxG6dhGEZM7MZpGIYRE7txGoZhxMRunIZhGDGxG6dhGEZM/h+EXvniD/G29wAAAABJRU5ErkJggg==\n",
|
| 60 |
-
"text/plain": [
|
| 61 |
-
"<Figure size 432x288 with 9 Axes>"
|
| 62 |
-
]
|
| 63 |
-
},
|
| 64 |
-
"metadata": {
|
| 65 |
-
"needs_background": "light"
|
| 66 |
-
},
|
| 67 |
-
"output_type": "display_data"
|
| 68 |
-
}
|
| 69 |
-
],
|
| 70 |
-
"source": [
|
| 71 |
-
"from tensorflow.keras.datasets import mnist\n",
|
| 72 |
-
"from tensorflow.keras.preprocessing.image import ImageDataGenerator\n",
|
| 73 |
-
"from matplotlib import pyplot\n",
|
| 74 |
-
"from tensorflow.keras import backend as K\n",
|
| 75 |
-
"\n",
|
| 76 |
-
"# Load data\n",
|
| 77 |
-
"(x_train, y_train), (x_test, y_test) = mnist.load_data()\n",
|
| 78 |
-
"\n",
|
| 79 |
-
"# Reshape our data to be in the forma [samples, width, height, color_depth]\n",
|
| 80 |
-
"x_train = x_train.reshape(x_train.shape[0], 28, 28, 1)\n",
|
| 81 |
-
"x_test = x_test.reshape(x_test.shape[0], 28, 28, 1)\n",
|
| 82 |
-
"\n",
|
| 83 |
-
"# Change datatype to float32\n",
|
| 84 |
-
"x_train = x_train.astype('float32')\n",
|
| 85 |
-
"x_test = x_test.astype('float32')\n",
|
| 86 |
-
"\n",
|
| 87 |
-
"# Create our image generator\n",
|
| 88 |
-
"# Define random rotation parameter to be 60 degrees\n",
|
| 89 |
-
"train_datagen = ImageDataGenerator(rotation_range=60)\n",
|
| 90 |
-
"\n",
|
| 91 |
-
"# fit parameters from data\n",
|
| 92 |
-
"train_datagen.fit(x_train)\n",
|
| 93 |
-
"\n",
|
| 94 |
-
"# configure batch size and retrieve one batch of images\n",
|
| 95 |
-
"for x_batch, y_batch in train_datagen.flow(x_train, y_train, batch_size=9):\n",
|
| 96 |
-
" # create a grid of 3x3 images\n",
|
| 97 |
-
" for i in range(0, 9):\n",
|
| 98 |
-
" pyplot.subplot(330 + 1 + i)\n",
|
| 99 |
-
" pyplot.imshow(x_batch[i].reshape(28, 28), cmap=pyplot.get_cmap('gray'))# show the plot\n",
|
| 100 |
-
" pyplot.show()\n",
|
| 101 |
-
" break"
|
| 102 |
-
]
|
| 103 |
-
},
|
| 104 |
-
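The same `flow()` call can also write the augmented images to disk, which is handy for inspecting what the generator actually produces. A minimal sketch, assuming a writable output directory (`augmented_samples/` is an illustrative name, not from the notebook):

```python
import os

from tensorflow.keras.datasets import mnist
from tensorflow.keras.preprocessing.image import ImageDataGenerator

(x_train, y_train), _ = mnist.load_data()
x_train = x_train.reshape(x_train.shape[0], 28, 28, 1).astype('float32')

# Illustrative output directory for the generated samples
os.makedirs('augmented_samples', exist_ok=True)

train_datagen = ImageDataGenerator(rotation_range=60)

# save_to_dir makes flow() write each augmented image out as a PNG
for x_batch, y_batch in train_datagen.flow(x_train, y_train, batch_size=9,
                                           save_to_dir='augmented_samples',
                                           save_prefix='rot',
                                           save_format='png'):
    break  # one batch of 9 images is enough for a visual check
```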
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### Random Shearing and Zooming"
]
},
{
"cell_type": "code",
"execution_count": 3,
"metadata": {},
"outputs": [
{
"data": {
"image/png": "<base64 PNG omitted: 3x3 grid of randomly sheared and zoomed MNIST digits>",
"text/plain": [
"<Figure size 432x288 with 9 Axes>"
]
},
"metadata": {
"needs_background": "light"
},
"output_type": "display_data"
}
],
"source": [
|
| 130 |
-
"from tensorflow.keras.datasets import mnist\n",
|
| 131 |
-
"from tensorflow.keras.preprocessing.image import ImageDataGenerator\n",
|
| 132 |
-
"from matplotlib import pyplot\n",
|
| 133 |
-
"from tensorflow.keras import backend as K\n",
|
| 134 |
-
"\n",
|
| 135 |
-
"# Load data\n",
|
| 136 |
-
"(x_train, y_train), (x_test, y_test) = mnist.load_data()\n",
|
| 137 |
-
"\n",
|
| 138 |
-
"# Reshape our data to be in the forma [samples, width, height, color_depth]\n",
|
| 139 |
-
"x_train = x_train.reshape(x_train.shape[0], 28, 28, 1)\n",
|
| 140 |
-
"x_test = x_test.reshape(x_test.shape[0], 28, 28, 1)\n",
|
| 141 |
-
"\n",
|
| 142 |
-
"# Change datatype to float32\n",
|
| 143 |
-
"x_train = x_train.astype('float32')\n",
|
| 144 |
-
"x_test = x_test.astype('float32')\n",
|
| 145 |
-
"\n",
|
| 146 |
-
"# Create our image generator\n",
|
| 147 |
-
"# Define shearing and zooming parameters to be 0.5 each\n",
|
| 148 |
-
"train_datagen = ImageDataGenerator(shear_range=0.5,\n",
|
| 149 |
-
" zoom_range=0.5)\n",
|
| 150 |
-
"\n",
|
| 151 |
-
"# fit parameters from data\n",
|
| 152 |
-
"train_datagen.fit(x_train)\n",
|
| 153 |
-
"\n",
|
| 154 |
-
"# configure batch size and retrieve one batch of images\n",
|
| 155 |
-
"for x_batch, y_batch in train_datagen.flow(x_train, y_train, batch_size=9):\n",
|
| 156 |
-
" # create a grid of 3x3 images\n",
|
| 157 |
-
" for i in range(0, 9):\n",
|
| 158 |
-
" pyplot.subplot(330 + 1 + i)\n",
|
| 159 |
-
" pyplot.imshow(x_batch[i].reshape(28, 28), cmap=pyplot.get_cmap('gray'))# show the plot\n",
|
| 160 |
-
" pyplot.show()\n",
|
| 161 |
-
" break"
|
| 162 |
-
]
|
| 163 |
-
},
|
| 164 |
-
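A single float `zoom_range=0.5` is shorthand for sampling zoom factors uniformly from `[1 - 0.5, 1 + 0.5]`, i.e. anywhere from half size to 1.5x, while `shear_range` is the shear angle in degrees. A minimal sketch of the explicit-interval form, plus a zoom-in-only variant (the `[0.7, 1.0]` interval is an illustrative choice, not from the notebook):

```python
from tensorflow.keras.preprocessing.image import ImageDataGenerator

# Equivalent to zoom_range=0.5: factors sampled from [0.5, 1.5]
datagen_default = ImageDataGenerator(shear_range=0.5, zoom_range=[0.5, 1.5])

# Zoom-in only: factors below 1.0 enlarge the digit within the frame
datagen_zoom_in = ImageDataGenerator(zoom_range=[0.7, 1.0])
```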
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### Horizontal and Vertical Flips"
]
},
{
"cell_type": "code",
"execution_count": 4,
"metadata": {},
"outputs": [
{
"data": {
"image/png": "<base64 PNG omitted: 3x3 grid of randomly flipped MNIST digits>",
"text/plain": [
"<Figure size 432x288 with 9 Axes>"
]
},
"metadata": {
"needs_background": "light"
},
"output_type": "display_data"
}
],
"source": [
|
| 190 |
-
"from tensorflow.keras.datasets import mnist\n",
|
| 191 |
-
"from tensorflow.keras.preprocessing.image import ImageDataGenerator\n",
|
| 192 |
-
"from matplotlib import pyplot\n",
|
| 193 |
-
"from tensorflow.keras import backend as K\n",
|
| 194 |
-
"\n",
|
| 195 |
-
"# Load data\n",
|
| 196 |
-
"(x_train, y_train), (x_test, y_test) = mnist.load_data()\n",
|
| 197 |
-
"\n",
|
| 198 |
-
"# Reshape our data to be in the forma [samples, width, height, color_depth]\n",
|
| 199 |
-
"x_train = x_train.reshape(x_train.shape[0], 28, 28, 1)\n",
|
| 200 |
-
"x_test = x_test.reshape(x_test.shape[0], 28, 28, 1)\n",
|
| 201 |
-
"\n",
|
| 202 |
-
"# Change datatype to float32\n",
|
| 203 |
-
"x_train = x_train.astype('float32')\n",
|
| 204 |
-
"x_test = x_test.astype('float32')\n",
|
| 205 |
-
"\n",
|
| 206 |
-
"# define data preparation\n",
|
| 207 |
-
"train_datagen = ImageDataGenerator(vertical_flip=True,\n",
|
| 208 |
-
" horizontal_flip=True)\n",
|
| 209 |
-
"\n",
|
| 210 |
-
"# fit parameters from data\n",
|
| 211 |
-
"train_datagen.fit(x_train)\n",
|
| 212 |
-
"\n",
|
| 213 |
-
"# configure batch size and retrieve one batch of images\n",
|
| 214 |
-
"for x_batch, y_batch in train_datagen.flow(x_train, y_train, batch_size=9):\n",
|
| 215 |
-
" # create a grid of 3x3 images\n",
|
| 216 |
-
" for i in range(0, 9):\n",
|
| 217 |
-
" pyplot.subplot(330 + 1 + i)\n",
|
| 218 |
-
" pyplot.imshow(x_batch[i].reshape(28, 28), cmap=pyplot.get_cmap('gray'))# show the plot\n",
|
| 219 |
-
" pyplot.show()\n",
|
| 220 |
-
" break"
|
| 221 |
-
]
|
| 222 |
-
},
|
| 223 |
-
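Flips are label-destroying for digits (a mirrored 3 or 7 is no longer a valid example of its class), so this transform is better suited to natural images such as the cats-vs-dogs data than to MNIST. A minimal sketch of the more common horizontal-only setup, assuming a hypothetical `data/train` directory with one subfolder per class:

```python
from tensorflow.keras.preprocessing.image import ImageDataGenerator

# Horizontal flips only: a mirrored photo of a cat is still a cat,
# whereas vertical flips would produce unrealistic upside-down images
train_datagen = ImageDataGenerator(rescale=1./255, horizontal_flip=True)

# flow_from_directory streams augmented batches straight from disk;
# 'data/train' is a hypothetical directory of class subfolders
train_generator = train_datagen.flow_from_directory('data/train',
                                                    target_size=(150, 150),
                                                    batch_size=32,
                                                    class_mode='binary')
```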
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### Random Shifts"
]
},
{
"cell_type": "code",
"execution_count": 5,
"metadata": {},
"outputs": [
{
"data": {
"image/png": "<base64 PNG omitted: 3x3 grid of randomly shifted MNIST digits; the rest of this cell is truncated in the source>",
AAAElFTkSuQmCC\n",
|
| 238 |
-
"text/plain": [
|
| 239 |
-
"<Figure size 432x288 with 9 Axes>"
|
| 240 |
-
]
|
| 241 |
-
},
|
| 242 |
-
"metadata": {
|
| 243 |
-
"needs_background": "light"
|
| 244 |
-
},
|
| 245 |
-
"output_type": "display_data"
|
| 246 |
-
}
|
| 247 |
-
],
|
| 248 |
-
"source": [
|
| 249 |
-
"from tensorflow.keras.datasets import mnist\n",
|
| 250 |
-
"from tensorflow.keras.preprocessing.image import ImageDataGenerator\n",
|
| 251 |
-
"from matplotlib import pyplot\n",
|
| 252 |
-
"from tensorflow.keras import backend as K\n",
|
| 253 |
-
"\n",
|
| 254 |
-
"# Load data\n",
|
| 255 |
-
"(x_train, y_train), (x_test, y_test) = mnist.load_data()\n",
|
| 256 |
-
"\n",
|
| 257 |
-
"# Reshape our data to be in the forma [samples, width, height, color_depth]\n",
|
| 258 |
-
"x_train = x_train.reshape(x_train.shape[0], 28, 28, 1)\n",
|
| 259 |
-
"x_test = x_test.reshape(x_test.shape[0], 28, 28, 1)\n",
|
| 260 |
-
"\n",
|
| 261 |
-
"# Change datatype to float32\n",
|
| 262 |
-
"x_train = x_train.astype('float32')\n",
|
| 263 |
-
"x_test = x_test.astype('float32')\n",
|
| 264 |
-
"\n",
|
| 265 |
-
"# define data preparation\n",
|
| 266 |
-
"train_datagen = ImageDataGenerator(width_shift_range=0.3,\n",
|
| 267 |
-
" height_shift_range=0.3)\n",
|
| 268 |
-
"\n",
|
| 269 |
-
"# fit parameters from data\n",
|
| 270 |
-
"train_datagen.fit(x_train)\n",
|
| 271 |
-
"\n",
|
| 272 |
-
"# configure batch size and retrieve one batch of images\n",
|
| 273 |
-
"for x_batch, y_batch in train_datagen.flow(x_train, y_train, batch_size=9):\n",
|
| 274 |
-
" # create a grid of 3x3 images\n",
|
| 275 |
-
" for i in range(0, 9):\n",
|
| 276 |
-
" pyplot.subplot(330 + 1 + i)\n",
|
| 277 |
-
" pyplot.imshow(x_batch[i].reshape(28, 28), cmap=pyplot.get_cmap('gray'))# show the plot\n",
|
| 278 |
-
" pyplot.show()\n",
|
| 279 |
-
" break"
|
| 280 |
-
]
|
| 281 |
-
### Applying all at once
In [6]:

from tensorflow.keras.datasets import mnist
from tensorflow.keras.preprocessing.image import ImageDataGenerator
from matplotlib import pyplot

# Load data
(x_train, y_train), (x_test, y_test) = mnist.load_data()

# Reshape our data to be in the format [samples, width, height, color_depth]
x_train = x_train.reshape(x_train.shape[0], 28, 28, 1)
x_test = x_test.reshape(x_test.shape[0], 28, 28, 1)

# Change datatype to float32
x_train = x_train.astype('float32')
x_test = x_test.astype('float32')

# Define data preparation: all augmentations applied at once
# (note: horizontal flips are rarely appropriate for digit images)
train_datagen = ImageDataGenerator(
    rotation_range=45,
    width_shift_range=0.2,
    height_shift_range=0.2,
    shear_range=0.2,
    zoom_range=0.2,
    horizontal_flip=True,
    fill_mode='nearest')

# Fit parameters from data
train_datagen.fit(x_train)

# Configure batch size and retrieve one batch of images
for x_batch, y_batch in train_datagen.flow(x_train, y_train, batch_size=9):
    # Create a grid of 3x3 images
    for i in range(0, 9):
        pyplot.subplot(330 + 1 + i)
        pyplot.imshow(x_batch[i].reshape(28, 28), cmap=pyplot.get_cmap('gray'))
    # Show the plot, then stop: flow() yields batches forever
    pyplot.show()
    break

Out: <Figure size 432x288 with 9 Axes>
[3x3 grid of heavily augmented MNIST digits; base64 PNG data omitted]
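To train on this augmented stream rather than just preview it, the generator is passed directly to fit; a minimal sketch, assuming a compiled Keras classifier named `model` for 28x28x1 inputs already exists (the name is hypothetical, not from the notebook):

# Stream augmented batches into training; steps_per_epoch bounds the infinite generator
model.fit(train_datagen.flow(x_train, y_train, batch_size=64),
          steps_per_epoch=len(x_train) // 64,
          epochs=5,
          validation_data=(x_test, y_test))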
### Read more about it at the official Keras Documentation
https://keras.io/preprocessing/image/
### Test Augmentation on a single image
- Outputs augmented copies to the `./output` directory (the `save_to_dir` argument below)
In [7]:

import os
from tensorflow.keras.preprocessing.image import ImageDataGenerator, img_to_array, load_img

datagen = ImageDataGenerator(
    rotation_range=40,
    width_shift_range=0.2,
    height_shift_range=0.2,
    shear_range=0.2,
    zoom_range=0.2,
    horizontal_flip=True,
    fill_mode='nearest')

img = load_img('dog.jpeg')
x = img_to_array(img)          # a NumPy array of shape (height, width, 3)
x = x.reshape((1,) + x.shape)  # add a batch axis: shape (1, height, width, 3)

# save_to_dir must already exist, so create it up front
os.makedirs('output', exist_ok=True)

i = 0
for batch in datagen.flow(x, save_to_dir='output', save_prefix='dog', save_format='jpeg'):
    i += 1
    if i > 35:
        break  # stop after ~36 augmented copies; the generator never ends on its own
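A quick sanity check that the augmented copies actually landed on disk (a throwaway sketch, assuming the cell above has run):

import os

# Expect roughly 36 files named dog_*.jpeg
files = sorted(os.listdir('output'))
print(len(files), 'files written;', files[:3], '...')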
(The notebook ends with an empty code cell and standard metadata: Python 3 kernel, nbformat 4, Python 3.7.4.)
Added in its place, a Git LFS pointer:

version https://git-lfs.github.com/spec/v1
oid sha256:ddd167c1ab455569d8304594d2622e70dc2ecd8126303dee640445cbe021ac54
size 88984
11. Confusion Matrix and Viewing Misclassifications/11.1 - 11.2 - MNIST Confusion Matrix Analysis and Viewing Misclassifications.ipynb
CHANGED
@@ -1,484 +1,3 @@
"cell_type": "markdown",
|
| 5 |
-
"metadata": {},
|
| 6 |
-
"source": [
|
| 7 |
-
"# Confusion Matrix Analysis and Viewing Misclassifications"
|
| 8 |
-
]
|
| 9 |
-
},
|
| 10 |
-
In [1]:

import tensorflow as tf
from tensorflow.keras.datasets import mnist
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense, Dropout, Flatten
from tensorflow.keras.layers import Conv2D, MaxPooling2D
from tensorflow.keras.optimizers import SGD
from tensorflow.keras.utils import to_categorical

# Training parameters
batch_size = 64
epochs = 1

# Load the MNIST dataset
(x_train, y_train), (x_test, y_test) = mnist.load_data()

# Let's store the number of rows and columns of a single image
img_rows = x_train[0].shape[0]
img_cols = x_train[0].shape[1]

# Get our data in the right 'shape' needed for Keras:
# we add a 4th dimension, changing the original image shape
# of (60000, 28, 28) to (60000, 28, 28, 1)
x_train = x_train.reshape(x_train.shape[0], img_rows, img_cols, 1)
x_test = x_test.reshape(x_test.shape[0], img_rows, img_cols, 1)

# Store the shape of a single image
input_shape = (img_rows, img_cols, 1)

# Change our image type to float32 data type
x_train = x_train.astype('float32')
x_test = x_test.astype('float32')

# Normalize our data by changing the range from (0 to 255) to (0 to 1)
x_train /= 255
x_test /= 255

print('x_train shape:', x_train.shape)
print(x_train.shape[0], 'train samples')
print(x_test.shape[0], 'test samples')

# Now we one-hot encode the outputs
y_train = to_categorical(y_train)
y_test = to_categorical(y_test)

# Count the number of columns in our one-hot encoded matrix
print("Number of Classes: " + str(y_test.shape[1]))

num_classes = y_test.shape[1]
num_pixels = x_train.shape[1] * x_train.shape[2]

# Create model
model = Sequential()

model.add(Conv2D(32, kernel_size=(3, 3), activation='relu', input_shape=input_shape))
model.add(Conv2D(64, (3, 3), activation='relu'))
model.add(MaxPooling2D(pool_size=(2, 2)))
model.add(Dropout(0.25))
model.add(Flatten())
model.add(Dense(128, activation='relu'))
model.add(Dropout(0.5))
model.add(Dense(num_classes, activation='softmax'))

model.compile(loss='categorical_crossentropy',
              optimizer=SGD(0.01),
              metrics=['accuracy'])

# We can use the summary function to display our model layers and parameters
print(model.summary())

history = model.fit(x_train, y_train,
                    batch_size=batch_size,
                    epochs=epochs,
                    verbose=1,
                    validation_data=(x_test, y_test))

score = model.evaluate(x_test, y_test, verbose=0)
print('Test loss:', score[0])
print('Test accuracy:', score[1])

Out:
x_train shape: (60000, 28, 28, 1)
60000 train samples
10000 test samples
Number of Classes: 10
Model: "sequential"
_________________________________________________________________
Layer (type)                 Output Shape              Param #
=================================================================
conv2d (Conv2D)              (None, 26, 26, 32)        320
_________________________________________________________________
conv2d_1 (Conv2D)            (None, 24, 24, 64)        18496
_________________________________________________________________
max_pooling2d (MaxPooling2D) (None, 12, 12, 64)        0
_________________________________________________________________
dropout (Dropout)            (None, 12, 12, 64)        0
_________________________________________________________________
flatten (Flatten)            (None, 9216)              0
_________________________________________________________________
dense (Dense)                (None, 128)               1179776
_________________________________________________________________
dropout_1 (Dropout)          (None, 128)               0
_________________________________________________________________
dense_1 (Dense)              (None, 10)                1290
=================================================================
Total params: 1,199,882
Trainable params: 1,199,882
Non-trainable params: 0
_________________________________________________________________
None
Train on 60000 samples, validate on 10000 samples
60000/60000 [==============================] - 117s 2ms/sample - loss: 0.7767 - accuracy: 0.7570 - val_loss: 0.2555 - val_accuracy: 0.9265
Test loss: 0.25551080925762654
Test accuracy: 0.9265
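The title promises a confusion matrix; a minimal sketch of that step for the model trained above, assuming scikit-learn is available (this is illustrative, not the notebook's own later cells):

import numpy as np
from sklearn.metrics import confusion_matrix

# Predict class probabilities on the test set, then take the arg-max class
y_pred = np.argmax(model.predict(x_test), axis=1)
y_true = np.argmax(y_test, axis=1)  # undo the one-hot encoding

# Rows are true digits, columns are predicted digits
cm = confusion_matrix(y_true, y_pred)
print(cm)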
#### Let's save our history file
In [2]:

import pickle

pickle_out = open("MNIST_history.pickle", "wb")
pickle.dump(history.history, pickle_out)
pickle_out.close()
#### Loading our saved history is as simple as these two lines
In [3]:

pickle_in = open("MNIST_history.pickle", "rb")
saved_history = pickle.load(pickle_in)
print(saved_history)

Out:
{'loss': [0.7766780077775319], 'accuracy': [0.75701666], 'val_loss': [0.255510806620121], 'val_accuracy': [0.9265]}
In [5]:

# Plotting our loss charts
import matplotlib.pyplot as plt

history_dict = history.history

loss_values = history_dict['loss']
val_loss_values = history_dict['val_loss']
epochs = range(1, len(loss_values) + 1)

line1 = plt.plot(epochs, val_loss_values, label='Validation/Test Loss')
line2 = plt.plot(epochs, loss_values, label='Training Loss')
plt.setp(line1, linewidth=2.0, marker='+', markersize=10.0)
plt.setp(line2, linewidth=2.0, marker='4', markersize=10.0)
plt.xlabel('Epochs')
plt.ylabel('Loss')
plt.grid(True)
plt.legend()
plt.show()

Out: <Figure size 432x288 with 1 Axes>
[training vs. validation loss chart; base64 PNG data omitted]
|
| 223 |
-
"cell_type": "code",
|
| 224 |
-
"execution_count": 7,
|
| 225 |
-
"metadata": {},
|
| 226 |
-
"outputs": [
|
| 227 |
-
{
|
| 228 |
-
"data": {
|
| 229 |
-
"image/png": "iVBORw0KGgoAAAANSUhEUgAAAY4AAAEGCAYAAABy53LJAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADh0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uMy4xLjEsIGh0dHA6Ly9tYXRwbG90bGliLm9yZy8QZhcZAAAgAElEQVR4nO3deXxU9b3/8deHsFkTFkHBCwrYUitSCBBBflAN0rLYK6igglrFqrhWr9YF13pR79XeuhZB0apFJZHaC6UFhIrErWiRCihwkUXQACKyBwEJfH5/zMk4DJNkDslkEvJ+Ph7zyDnf8z3f+XxCyCdnme8xd0dERCRZddIdgIiI1CwqHCIiEooKh4iIhKLCISIioahwiIhIKHXTHUBVaN68ubdt2zbdYYSyc+dOjjzyyHSHUaWUc+2gnGuO+fPnf+3uR8e314rC0bZtWz788MN0hxFKQUEBubm56Q6jSinn2kE51xxmtiZRu05ViYhIKCocIiISigqHiIiEUiuucYhUV5OXf0vJqe+9e/dSWFjI7t270xpTqjVu3JilS5emO4wqVd1zbtiwIa1bt6ZevXpJ9VfhEEmjv6zcyxPBcmFhIVlZWbRt2xYzS2tcqbRjxw6ysrLSHUaVqs45uzubNm2isLCQdu3aJbWPTlWJVBO7d++mWbNmh3XRkOrHzGjWrFmoI10VDpFqREVD0iHsz11KC4eZDTCzZWa2wsxGJdjexsxmm9kiMysws9ZBe7aZzTWzxcG2C2L2edHMPjOzBcErO5U5iFR3j/3903SHILVMygqHmWUATwEDgQ7AcDPrENftd8AEd+8EjAb+O2j/BrjE3U8GBgCPm1mTmP1udffs4LUgVTmIVKbH/v4pbUdNO+AFRJcLt+xiUeFWNmwPd3H8idnLKyW+3NxcZs6ceUDb448/zrXXXlvmfpmZmQCsW7eOoUOHljp2eR/Cffzxx/nmm2+i62eeeSZbt25NJvSE5s6dS7t27cjOziY7O5vMzExOPPFEsrOzueSSS0KNtX//fh566KEy+8ybNw8zY/bs2Yccc43h7il5AT2BmTHrdwB3xPVZDLQOlg3YXspYC4H2wfKLwNAwsXTr1s1rmjlz5qQ7hCpXG3Nuc/vfostLliyp8BgV8fTTT/uIESMOaOvRo4e//fbbZe535JFHljv26aef7vPmzXN39+3btyfs06ZNG9+4cWOS0Zbv3nvv9ddeey1hDGHt3bvXGzduXGafm266yXv37u2XX375QdtKy/lQY0mFRD9/wIee4HdqKu+qagV8EbNeCPSI67MQGAI8AZwDZJlZM3ffVNLBzLoD9YGVMfs9aGb3ArOBUe6+J/7NzWwkMBKgRYsWFBQUVDihqlRUVFTjYq6o2pgzEM25cePG7Nix45DGONT9YvXv35+77rqLr7/+mgYNGrBmzRrWrl1L586dWb9+PcOHD2fr1q3s3buXe+65h5///OcHvP+aNWs4//zz+eCDD9i1axfXXHMNy5Yt48QTT6SoqIidO3eyY8cObrzxRhYsWMCuXbsYPHgwd911F+PGjWPdunWcfvrpNGvWjGnTptGxY0feeustmjVrxpgxY3jppZcAuOSSS7juuutYs2YNQ4YMoWfPnnzwwQcce+yx5Ofnc8QRRwAwa9Ysrrzyyuj3Zt++fdEYAIqLi7nnnnuYO3cue/bs4eqrr+bSSy9l3bp1jBgxgp07d1JcXMyTTz7J1KlT2bFjB506deLkk09m/PjxB3zv9u/fz2uvvcbf/vY3BgwYwKZNm6hfvz4AL7/8MmPGjKFOnTp07tyZcePGsWHDBm688UbWrFmDmfHEE09w1FFHcckll/Dee+8B8Oijj1JcXMxtt91Gv3796N27N//4xz8466yzaNu2Lb/73e/Yu3cvzZo147nnnuPoo49mx44d3HLLLSxcuBAz46677uKrr75i1apVPPDAAwA899xzrFmzhvvvv/+AHHbv3p30/79UFo5EV1vin1N7CzDGzEYAbwNrgeLoAGbHAi8Bl7r7/qD5DuBLIsVkPHA7kdNcB76R+/hgOzk5OV7T5ompqXPbVERtzJnXp0VzXrp0afSWzZLTWMn68YNvJ9Vv9UM/L3VbVlYWPXr04L333mPw4MH87W9/Y9iwYTRq1Ijvfe97TJ06lUaNGvH1119z6qmncsEFF0QvqmZlZZGZmUmdOnXIysri2WefpXHjxnzyyScsWrSIrl27cuSRR5KVlcVvfvMb2rRpw759++jbty+fffYZt912G2PHjuWtt96iefPmQOSCbWZmJp9++ikTJ05k3rx5uDs9evSgf//+NG3alJUrV/Lqq6+SnZ3N+eefz6xZs7j44ov5+uuvo59NKJGRkRGNAWDs2LG0bt2a+fPns2fPHk499VQGDRrElClTOPvss7n99tvZt28fu3btIjc3lwkTJrBo0aKE37uCggJOOukkOnXqRO/evXnvvfcYNGgQCxcu5IknnmDmzJm0adOGzZs3k5WVxYgRIxg4cCDXX389xcXFfPPNN3z11VfR7x9AgwYNyMjIICsri4yMDHbv3h0tKlu2bIl+/59++mmeeeYZHn74Ye677z7+7d/+jYkTJ+LubN26lbp165Kdnc0jjzxC3bp1ycvL48UXXzzo9uCGDRvSpUuXpH6OUlk4CoHjYtZbA+tiO7j7OuBcADPLBIa4+7ZgvREwDbjb3d+P2Wd9sLjHzF4gUnxEpBIMHz6c/Px8Bg8eTH5+Ps8//zwQOaV955138vbbb1OnTh3Wrl3Lhg0baNmyZcJx3n77bW644QYAOnXqRKdOnaLbJk+ezIQJEyguLmb9+vUsWbLkgO3x3n33Xc4555zo7LLnnnsu77zzDoMGDYpewwDo1q0bq1evBiJHG/369Ssz11mzZrF06VLy8/MB2LZtG8uXL+eUU07hqquuYvfu3Zx99tl07tyZ4uLiMsfKy8tj2LBhAAwbNoy8vDwGDRrEm2++yQUXXMBRRx0FEP1aUFAQfd+6devSqFEjvvrqqzLfo2R8gM8//5zzzz+fL7/8kj179vDDH/4QgDfeeIMpU6YAkcLbtGlTAE477TRmzJjBCSecQEZGBh06xF9uDieVhWMe0N7M2hE5khgGXBjbwcyaA5uDo4k7gOeD9vrAZCIXzv8Ut8+x7r7eIn/qnA18ksIcRFJq8PcTf1K3rCODeG1HTQvVvyxnn302N998M//617/YtWsXXbt2BeCVV15h48aNzJ8/n3r16tG2bdty7/tPdIvnZ599xpNPPsn8+fNp2rQpI0aMKHecyKn2xBo0aBBdzsjIYNeuXQDMmDGDm2++udxxx44dS9++fQ/aVlBQwLRp07jooou44447uOCCCxKMELF3714mT57M9OnT+c///E/279/P1q1b2blzJ+5e6q2u8e1169Zl//790fXdu3dTt+53v6Jjp2W/7rrruPPOOznzzDN54403ohfuS3u/K664gkcffZS2bdty2WWXlZpLslJ2V5W7FwPXAzOBpcAkd19sZqPNb
FDQLRdYZmafAi2AB4P284HTgBEJbrt9xcw+Bj4GmgMPpCoHkVQ7p339dIdwgMzMTHJzc/nlL3/J8OHDo+3btm3jmGOOoV69esyZM4c1axLOth112mmn8corrwBET1cBbN++nSOPPJLGjRuzYcMGZsyYEd0nKysr4bWa0047jSlTpvDNN9+wc+dOJk+ezE9+8pNS39vdWbRoUfRIpDT9+/dn7Nix0aOJZcuWsWvXLtasWUPLli0ZOXIkI0aM4KOPPor+Ak905DFr1ixOOeUUvvjiC1avXs3nn3/OWWedxdSpU/npT39Kfn4+mzdvBoh+7dOnD08//TQQufayfft2WrZsybp169iyZQu7d+9m2rTST1du27aNVq1a4e788Y9/jLb369ePMWPGRL8PW7ZsAaBXr16sXLmSP/3pT2UWwWSldMoRd58OTI9ruzdm+TXgtQT7vQy8XMqYZ1RymCISY/jw4Zx77rnRUykAF110EWeddRY5OTlkZ2fzox/9qMwxrrnmGi677DI6depEdnY23bt3B6Bz587RC8wnnHACvXr1iu4zcuRIBg4cyLHHHsucOXOi7V27dmXEiBHRMa644gq6dOkSPS0Vb/78+XTp0qXcD7VdddVVfP7559ECc8wxx/CXv/yF2bNn8+ijj1KvXj0yMzN5+eXIr6LLL7+cTp06kZOTw4QJE6Lj5OXlcc455xww9pAhQ3jhhRf461//ym233cbAgQOpX78+3bp14w9/+ANjxozhyiuv5JlnnqFu3bo888wzdO/enTvvvJNTTjmFE044oczTSffddx/nnHMOrVu3pnv37qxfHzmD/5vf/IZrr72Wjh07kpGRwf3338+gQZG/04cOHcr//d//0bhx4zK/L0lJdKvV4fbS7bg1Q23POd2341aVyrw1NZH777/f8/LyUvoeYaU652T079/fCwoKSt1eXW7HFZEqcGPf9ukOoVq5++670x1CtbJp0yZ69uxJt27dOP300ytlTBUOkRrupp/9MN0hSDXWrFkzPv20cqel0SSHIiISigqHiIiEosIhIiKhqHCIiEgoKhwiNdW7j8GODZU23KZNm6JTkLds2ZJWrVpF17/99tukxrjssstYtmxZmX3Gjx8f/XBgZdiwYQN169blD3/4Q6WNKWXTXVUiNVXRV/DeEzDgvypluGbNmrFgQeTxNvfddx+ZmZnccsuBU8GV3Mdfp07ivzlfeOGFct9n5MiRlfr87VdffZWePXuSl5fH5ZdfXmnjxisuLj5gCpDaTEccIjVVrxth4cRKPepIZMWKFXTs2JGrr76arl27sn79ekaOHElOTg4nn3wyo0d/Nzl17969WbBgAcXFxTRp0oRRo0bRuXNnevbsGZ3Eb/To0Tz++OPR/qNGjaJ79+6ceOKJ/OMf/wBg586dDBkyhM6dOzN8+HBycnKiRS1eXl4ejz/+OKtWreLLL7+Mtk+bNo2uXbvSuXPn6ISHO3bs4NJLL+XHP/4xnTp1YsqUKdFYS+Tn53PFFVcAcPHFF/PrX/+aPn36cOedd/L+++/Ts2dPunTpQq9evVi+PPIQreLiYm666SY6duxIp06dGDt2LDNnzuS8886LjjtjxgzOP//8Cv97VAcqnyLV0X0hpoV4JMTnOO7bFj4WYMmSJbzwwgvR+ZUeeughjjrqKIqLi+nTpw9Dhw49aIqMbdu2cfrpp/PQQw9x88038/zzzzNq1EFPkMbd+ec//8nUqVMZPXo0r7/+Or///e9p2bIlf/7zn1m4cGF0ssV4q1evZsuWLXTr1o2hQ4cyadIkbrjhBr788kuuueYa3nnnneh05hA5kjr66KP5+OOPo9OOl2flypXMnj2bOnXqsG3bNt59910yMjJ4/fXXufvuu3n11VejzxNZuHAhGRkZbN68mSZNmnDDDTdEn83xwgsvVMoEg9WBjjhEpFzf//73OeWUU6LreXl5dO3ala5du7J06VKWLFly0D5HHHEEAwcOBA6c8jzeueeee1Cfd999NzqNeOfOnTn55JMT7puXlxedtK9kOnOIPDa2T58+tGnTBvhuOvM33niD6667Djhw2vGynHfeedFTc1u3buXcc8+lY8eO3HLLLSxevDg67tVXX01GRkb0/erUqcOFF17IxIkT2bx5M/Pnzy93qveaQkccItVRMkcG3+6E8X2g939A9oXl96+A2Cm9ly9fzhNPPME///lPmjRpwsUXX5xwavSSJ+BBZMrz0p5pUTI1emwfL2Mq9Vh5eXls2rQpOkPsunXr+Oyzz0qdXjxRe506dQ54v/hcYnO/66676N+/P9deey0rVqxgwIABpY4L8Mtf/pIhQ4awe/duLrjggmhhqel0xCFSU027BVrnpLxoxNu+fTtZWVk0atSI9evXM3PmzEp/j969ezNp0iQAPv7444RHNEuWLGHfvn2sXbuW1atXs3r1am699Vby8/Pp1asXb775ZnT695JTVYmmHa9Tpw5NmzZl+fLl7N+/n8mTJ5caV8l05gAvvvhitL1fv36MGzeOffv2HfB+xx13HM2bN+exxx5jxIgRFfumVCMqHCI10UevwLp/wZn/U+Vv3bVrVzp06EDHjh258sorD5gavbL86le/Yu3atXTq1IlHHnmEjh07HjQd+MSJExNOZz5x4kRatGjBuHHjGDx4MJ07d+aiiy4CItOOb9iwgY4dO5Kdnc0777wDwMMPP8yAAQPo27fvAY+bjXf77bdz6623HpTzVVddRcuWLenUqROdO3eOFj2ACy+8kDZt2kSf0ndYSDRl7uH20rTqNUNtzznUtOrvPOa+4dCmYU+3ZKYY37t3r+/atcvd3T/99FNv27at7927N9WhpcRVV13l48aNS3cY5dK06iKHu97/ke4IUqqoqIi+fftSXFyMu0cfeFTTZGdn07RpUx588MHyO9cgNe9fQkQOe02aNGH+/PnpDqPCSj57kuiRuDWZrnGIVCOe5N1EIpUp7M+dCodINdGwYUM2bdqk4iFVyt3ZtGkTDRs2THqflJ6qMrMBwBNABvCcuz8Ut70N8DxwNLAZuNjdC4NtlwIlz4B8wN3/GLR3A14EjgCmAze6/qfJYaB169YUFhaycePGdIeSUrt37w71S+pwUN1zbtiwYZl3k8VLWeEwswzgKeBnQCEwz8ymunvsDdm/Aya4+x/N7Azgv4FfmNlRwG+AHMCB+cG+W4BxwEjgfSKFYwAwI1V5iFSVevXq0a5du3SHkXIFBQV06dIl3WFUqcMt51SequoOrHD3Ve7+LZAPDI7r0wGYHSzPidneH/i7u28OisXfgQFmdizQyN3nBkcZE4CzU5iDiIjESeWpqlbAFzHrhUCPuD4LgSFETmedA2SZWbNS9m0VvAoTtB/EzEYSOTKhRYsWFBQUHGoeaVFUVFTjYq4o5Vw7KOeaL5WF4+CJWyKnnWLdAowxsxHA28BaoLiMfZMZM9LoPh4YD5CTk+O5ublJBV1dFBQUUNNirijlXDso55ovlYWjEDguZr01sC62g7uvA84FMLNMYIi7bzOzQiA3bt+CYMzWce0HjCki
IqmVymsc84D2ZtbOzOoDw4CpsR3MrLmZlcRwB5E7rABmAv3MrKmZNQX6ATPdfT2ww8xOtchUlJcAf0lhDiIiEidlhcPdi4HriRSBpcAkd19sZqPNbFDQLRdYZmafAi2AB4N9NwP3Eyk+84DRQRvANcBzwApgJbqjSkSkSqX0cxzuPp3ILbOxbffGLL8GvFbKvs/z3RFIbPuHQMfKjVRERJKlT46LiEgoKhwiIhKKCoeIiISiwiEiIqGocIiISCgqHCIiEooKh4iIhKLCISIioahwiIhIKCocIiISigqHiIiEosIhIiKhqHCIiEgoKhwiIhKKCoeIiISiwiEiIqGocIiISCgqHCIiEooKh4iIhKLCISIioaS0cJjZADNbZmYrzGxUgu3Hm9kcM/vIzBaZ2ZlB+0VmtiDmtd/MsoNtBcGYJduOSWUOIiJyoLqpGtjMMoCngJ8BhcA8M5vq7ktiut0NTHL3cWbWAZgOtHX3V4BXgnF+DPzF3RfE7HeRu3+YqthFRKR0qTzi6A6scPdV7v4tkA8MjuvjQKNguTGwLsE4w4G8lEUpIiKhmLunZmCzocAAd78iWP8F0MPdr4/pcywwC2gKHAn81N3nx42zEhjs7p8E6wVAM2Af8GfgAU+QhJmNBEYCtGjRolt+fn6l55hKRUVFZGZmpjuMKqWcawflXHP06dNnvrvnxLen7FQVYAna4n/BDwdedPdHzKwn8JKZdXT3/QBm1gP4pqRoBC5y97VmlkWkcPwCmHDQG7mPB8YD5OTkeG5uboUTqkoFBQXUtJgrSjnXDsq55kvlqapC4LiY9dYcfCrqcmASgLvPBRoCzWO2DyPuNJW7rw2+7gAmEjklJiIiVSSVhWMe0N7M2plZfSJFYGpcn8+BvgBmdhKRwrExWK8DnEfk2ghBW10zax4s1wP+HfgEERGpMik7VeXuxWZ2PTATyACed/fFZjYa+NDdpwK/Bp41s5uInMYaEXO94jSg0N1XxQzbAJgZFI0M4A3g2VTlICIiB0vlNQ7cfTqRW2xj2+6NWV4C9Cpl3wLg1Li2nUC3Sg9URESSpk+Oi4hIKCocIiISigqHiIiEosIhIiKhqHCIiEgoKhwiIhKKCoeIiISiwiEiIqGocIiISCgqHCIiEooKh4iIhKLCISIioahwiIhIKOUWDjO73syaVkUwIiJS/SVzxNESmGdmk8xsgJkleiSsiIjUEuUWDne/G2gP/AEYASw3s/8ys++nODYREamGkrrGETyV78vgVQw0BV4zs9+mMDYREamGyn0CoJndAFwKfA08B9zq7nuDZ4IvB25LbYgiIlKdJPPo2ObAue6+JrbR3feb2b+nJiwREamukjlVNR3YXLJiZllm1gPA3ZemKjAREamekikc44CimPWdQVu5gruwlpnZCjMblWD78WY2x8w+MrNFZnZm0N7WzHaZ2YLg9XTMPt3M7ONgzCd1l5eISNVKpnBYcHEciJyiIrlrIxnAU8BAoAMw3Mw6xHW7G5jk7l2AYcDYmG0r3T07eF0d0z4OGEnkTq/2wIAkchARkUqSTOFYZWY3mFm94HUjsCqJ/boDK9x9lbt/C+QDg+P6ONAoWG4MrCtrQDM7Fmjk7nODYjYBODuJWEREpJIkc3H8auBJIkcHDswm8hd/eVoBX8SsFwI94vrcB8wys18BRwI/jdnWzsw+ArYDd7v7O8GYhXFjtkr05mY2siTOFi1aUFBQkETI1UdRUVGNi7milHPtoJxrvnILh7t/ReQ0UliJrj143Ppw4EV3f8TMegIvmVlHYD1wvLtvMrNuwBQzOznJMUviHg+MB8jJyfHc3NxDSCF9CgoKqGkxV5Ryrh2Uc82XzLWKhsDlwMlAw5J2d/9lObsWAsfFrLfm4FNRlxNco3D3ucF7NQ+K1Z6gfb6ZrQR+GIzZupwxRUQkhZK5xvESkfmq+gNvEfllvSOJ/eYB7c2snZnVJ3LUMjWuz+dAXwAzO4lIYdpoZkcHF9cxsxOIXARf5e7rgR1mdmpwN9UlwF+SiEVERCpJMoXjB+5+D7DT3f8I/Bz4cXk7uXsxcD0wE1hK5O6pxWY22swGBd1+DVxpZguBPGBEcNH7NGBR0P4acLW7l3yW5Boin2BfAawEZiSZq4iIVIJkLo7vDb5uDa4/fAm0TWZwd59O5AOEsW33xiwvAXol2O/PwJ9LGfNDoGMy7y8iIpUvmcIxPngex91ETjVlAvekNCoREam2yiwcwUSG2919C/A2cEKVRCUiItVWmdc4gk+JX19FsYiISA2QzMXxv5vZLWZ2nJkdVfJKeWQiIlItJXONo+TzGtfFtDk6bSUiUisl88nxdlURiIiI1AzJfHL8kkTt7j6h8sMREZHqLplTVafELDck8knvfxGZmVZERGqZZE5V/Sp23cwaE5mGREREaqFk7qqK9w2RuaNERKQWSuYax1/5buryOkSe5jcplUGJiEj1lcw1jt/FLBcDa9y9sLTOIiJyeEumcHwOrHf33QBmdoSZtXX31SmNTEREqqVkrnH8Cdgfs74vaBMRkVoomcJR192/LVkJluunLiQREanOkikcG2MevISZDQa+Tl1IIiJSnSVzjeNq4BUzGxOsFxJ5ZKuIiNRCyXwAcCVwqpllAubuyTxvXEREDlPlnqoys/8ysybuXuTuO8ysqZk9UBXBiYhI9ZPMNY6B7r61ZCV4GuCZqQtJRESqs2QKR4aZNShZMbMjgAZl9I8yswFmtszMVpjZqATbjzezOWb2kZktMrMzg/afmdl8M/s4+HpGzD4FwZgLgtcxycQiIiKVI5mL4y8Ds83shWD9MuCP5e1kZhnAU8DPiFxQn2dmU919SUy3u4FJ7j7OzDoA04G2RO7aOsvd15lZR2Am0Cpmv4vc/cMkYhcRkUqWzMXx35rZIuCngAGvA22SGLs7sMLdVwGYWT4wGIgtHA40CpYbA+uC9/wops9ioKGZNXD3PUm8r4iIpFCys+N+SeTT40OIPI9jaRL7tAK+iFkv5MCjBoD7gIvNrJDI0cavONgQ4KO4ovFCcJrqHjOz5FIQEZHKUOoRh5n9EBgGDAc2Aa8SuR23T5JjJ/qF7nHrw4EX3f0RM+sJvGRmHd19fxDDycDDQL+YfS5y97VmlgX8GfgFCR4qZWYjgZEALVq0oKCgIMmwq4eioqIaF3NFKefaQTkfBtw94YvIEcZbwA9i2laV1j/B/j2BmTHrdwB3xPVZDBwXOz5wTLDcGvgU6FXGe4wAxpQXS7du3bymmTNnTrpDqHLKuXZQzjUH8KEn+J1a1qmqIUROUc0xs2fNrC+JjyJKMw9ob2btzKw+kaOXqXF9Pidy6gszO4nIo2k3mlkTYFpQaN4r6Wxmdc2sebBcD/h34JMQMYmISAWVWjjcfbK7XwD8CCgAbgJamNk4M+tX2n4x+xcD1xO5I2opkbunFpvZ6Ji5r34NXGlmC4E8YERQ5a4HfgDcE3fbbQNgZnCxfgGwFnj2kDIXEZFDksxdVTuBV4jMV3UUcB4wCpiVxL7TiVz0jm27N2Z5CdArwX4PAKV9Or1
bee8rIiKpE+qZ4+6+2d2fcfczyu8tIiKHo1CFQ0RERIVDRERCUeEQEZFQVDhERCQUFQ4REQlFhUNEREJR4RARkVBUOEREJBQVDhERCUWFQ0REQlHhEBGRUFQ4REQkFBUOEREJRYVDRERCUeEQEZFQVDhERCQUFQ4REQlFhUNEREJR4RARkVBUOEREJJSUFg4zG2Bmy8xshZmNSrD9eDObY2YfmdkiMzszZtsdwX7LzKx/smOKiEhqpaxwmFkG8BQwEOgADDezDnHd7gYmuXsXYBgwNti3Q7B+MjAAGGtmGUmOKSIiKZTKI47uwAp3X+Xu3wL5wOC4Pg40CpYbA+uC5cFAvrvvcffPgBXBeMmMKSIiKVQ3hWO3Ar6IWS8EesT1uQ+YZWa/Ao4Efhqz7/tx+7YKlssbEwAzGwmMBGjRogUFBQWhE0inoqKiGhdzRSnn2kE513ypLByWoM3j1tD5lG4AAAvGSURBVIcDL7r7I2bWE3jJzDqWsW+iI6T4MSON7uOB8QA5OTmem5ubbNzVQkFBATUt5opSzrWDcq75Ulk4CoHjYtZb892pqBKXE7mGgbvPNbOGQPNy9i1vTBERSaFUXuOYB7Q3s3ZmVp/Ixe6pcX0+B/oCmNlJQENgY9BvmJk1MLN2QHvgn0mOKSIiKZSyIw53Lzaz64GZQAbwvLsvNrPRwIfuPhX4NfCsmd1E5JTTCHd3YLGZTQKWAMXAde6+DyDRmKnKQUREDpbKU1W4+3RgelzbvTHLS4Bepez7IPBgMmOKiEjV0SfHRUQkFBUOEREJRYVDRERCUeEQEZFQVDhERCQUFQ4REQlFhUNEREJR4RARkVBUOEREJBQVDhERCUWFQ0REQlHhEBGRUFQ4REQkFBUOEREJRYVDRERCUeEQEZFQVDhERCQUFQ4REQlFhUNEREJR4RARkVBSWjjMbICZLTOzFWY2KsH2x8xsQfD61My2Bu19YtoXmNluMzs72PaimX0Wsy07lTmIiMiB6qZqYDPLAJ4CfgYUAvPMbKq7Lynp4+43xfT/FdAlaJ8DZAftRwErgFkxw9/q7q+lKnYRESldKo84ugMr3H2Vu38L5AODy+g/HMhL0D4UmOHu36QgRhERCcncPTUDmw0FBrj7FcH6L4Ae7n59gr5tgPeB1u6+L27bm8Cj7v63YP1FoCewB5gNjHL3PQnGHAmMBGjRokW3/Pz8Sswu9YqKisjMzEx3GFVKOdcOyrnm6NOnz3x3z4lvT9mpKsAStJVWpYYBryUoGscCPwZmxjTfAXwJ1AfGA7cDow96I/fxwXZycnI8Nzc3ZPjpVVBQQE2LuaKUc+2gnGu+VJ6qKgSOi1lvDawrpe8wEp+mOh+Y7O57Sxrcfb1H7AFeIHJKTEREqkgqC8c8oL2ZtTOz+kSKw9T4TmZ2ItAUmJtgjIOuewRHIZiZAWcDn1Ry3CIiUoaUnapy92Izu57IaaYM4Hl3X2xmo4EP3b2kiAwH8j3uYouZtSVyxPJW3NCvmNnRRE6FLQCuTlUOIiJysFRe48DdpwPT49rujVu/r5R9VwOtErSfUXkRiohIWPrkuIiIhKLCISIioahwiIhIKCocIiISigqHiIiEosIhIiKhqHCIiEgoKhwiIhKKCoeIiISiwiEiIqGocIiISCgqHCLp8O5jsGNDuqMQOSQqHCLpUPQVvPdEuqMQOSQqHCLp0OtGWDiR+nu2pDsSkdBUOETSIasldB7OcV/8b7ojEQktpc/jEJEy/LA/x70/GO5rfGD76aOgzx3piUkkCSocIunw7U6YfhtLf3QjJw0bne5oRELRqSqRdJh2C7TOYUNLPdBSah4VDpGq9tErsO5fcOb/pDsSkUOiwiFS1XZuhPNehPpHpjsSkUOS0sJhZgPMbJmZrTCzUQm2P2ZmC4LXp2a2NWbbvphtU2Pa25nZB2a23MxeNbP6qcxBpNL1/g845qR0RyFyyFJWOMwsA3gKGAh0AIabWYfYPu5+k7tnu3s28Hsg9t7EXSXb3H1QTPvDwGPu3h7YAlyeqhxERORgqTzi6A6scPdV7v4tkA8MLqP/cCCvrAHNzIAzgNeCpj8CZ1dCrCIikqRU3o7bCvgiZr0Q6JGoo5m1AdoBb8Y0NzSzD4Fi4CF3nwI0A7a6e3HMmK1KGXMkMBKgRYsWFBQUHHomaVBUVFTjYq4o5Vw7KOeaL5WFwxK0eSl9hwGvufu+mLbj3X2dmZ0AvGlmHwPbkx3T3ccD4wFycnI8Nzc36cCrg4KCAmpazBWlnGsH5VzzpbJwFALHxay3BtaV0ncYcF1sg7uvC76uMrMCoAvwZ6CJmdUNjjrKGjNq/vz5X5vZmtAZpFdz4Ot0B1HFlHPtoJxrjjaJGlNZOOYB7c2sHbCWSHG4ML6TmZ0INAXmxrQ1Bb5x9z1m1hzoBfzW3d3M5gBDiVwzuRT4S3mBuPvRlZBPlTKzD909J91xVCXlXDso55ovZRfHgyOC64GZwFJgkrsvNrPRZhZ7l9RwIN/dY085nQR8aGYLgTlErnEsCbbdDtxsZiuIXPP4Q6pyEBGRg9mBv6+lujjc/kJJhnKuHZRzzadPjldf49MdQBoo59pBOddwOuIQEZFQdMQhIiKhqHCIiEgoKhxpkMTkj23MbLaZLTKzAjNrHbPteDObZWZLzWyJmbWtytgPVQVz/q2ZLQ5yfjKYeqZaM7PnzewrM/uklO0W5LIiyLlrzLZLg0k8l5vZpVUXdcUcas5mlm1mc4N/40VmdkHVRn7oKvLvHGxvZGZrzWxM1URcSdxdryp8ARnASuAEoD6wEOgQ1+dPwKXB8hnASzHbCoCfBcuZwPfSnVMqcwb+H/BeMEYGkc/75KY7pyRyPg3oCnxSyvYzgRlEZlg4FfggaD8KWBV8bRosN013PinO+YdA+2D534D1QJN055PKnGO2PwFMBMakO5cwLx1xVL1kJn/sAMwOlueUbA9mF67r7n8HcPcid/+masKukEPOmciUMg2JFJwGQD1gQ8ojriB3fxvYXEaXwcAEj3ifyIwIxwL9gb+7+2Z33wL8HRiQ+ogr7lBzdvdP3X15MMY64CugRnxotwL/zphZN6AFMCv1kVYuFY6ql2jyx/iJGhcCQ4Llc4AsM2tG5C+zrWb2v2b2kZn9TzB9fXV3yDm7+1wihWR98Jrp7ktTHG9VKO17ksz3qqYqNzcz607kj4SVVRhXKiXM2czqAI8At6YlqgpS4ah6yUz+eAtwupl9BJxOZMqWYiJTxPwk2H4KkVM/I1IWaeU55JzN7AdEZhJoTeQ/4Rlmdloqg60ipX1PwkwOWtOUmVvwl/hLwGXuvr/Kokqt0nK+Fpju7l8k2F7tpXKuKkms3Mkfg8P1cwHMLBMY4u7bzKwQ+MjdVwXbphA5b1rdp12pSM4jgffdvSjYNoNIzm9XReApVNr3pBDIjWsvqLKoUqvUnwMzawRMA+4OTukcLkrLuSfwEzO7lsi1yvpmVuTuB904Uh3piKPqRSd/tMhjb4cBU2M7mFnz4FAW4A7g+Zh9m5pZyfnfM4AlVH8VyflzIkcidc
2sHpGjkcPhVNVU4JLgrptTgW3uvp7I3G79zKypRSb77Be0HQ4S5hz8TEwmci3gT+kNsdIlzNndL3L34929LZGj7Qk1pWiAjjiqnLsXm1nJ5I8ZwPMeTP4IfOjuU4n8xfnfZuZE/rK+Lth3n5ndAswObkmdDzybjjzCqEjORJ72eAbwMZFD/Nfd/a9VnUNYZpZHJKfmwZHib4hc2MfdnwamE7njZgXwDXBZsG2zmd1PpNgCjHb3si6+VhuHmjNwPpG7k5qZ2YigbYS7L6iy4A9RBXKu0TTliIiIhKJTVSIiEooKh4iIhKLCISIioahwiIhIKCocIiISigqHyCEys31mtiDmVWn34ZtZ29JmXBVJN32OQ+TQ7XL37HQHIVLVdMQhUsnMbLWZPWxm/wxePwjaY585MtvMjg/aW5jZZDNbGLz+XzBUhpk9GzynYpaZHRH0v8Eiz2JZZGb5aUpTajEVDpFDd0TcqarYBxBtd/fuwBjg8aBtDJGpJToBrwBPBu1PAm+5e2ciz3ZYHLS3B55y95OBrXw3e/AooEswztWpSk6kNPrkuMghCialy0zQvho4w91XBfNrfenuzczsa+BYd98btK939+ZmthFo7e57YsZoS+S5HO2D9duBeu7+gJm9DhQBU4ApJRNAilQVHXGIpIaXslxan0T2xCzv47trkj8HngK6AfPNTNcqpUqpcIikxgUxX+cGy/8gMjMwwEXAu8HybOAaADPLCKYYTyiYQfg4d58D3AY0ITItt0iV0V8qIofuCDOLncH19ZipsRuY2QdE/jgbHrTdADxvZrcCG/luptQbgfFmdjmRI4triDztMJEM4GUza0zkIUGPufvWSstIJAm6xiFSyYJrHDnu/nW6YxFJBZ2qEhGRUHTEISIioeiIQ0REQlHhEBGRUFQ4REQkFBUOEREJRYVDRERC+f8p1HH4rJNybgAAAABJRU5ErkJggg==\n",
|
| 230 |
-
"text/plain": [
|
| 231 |
-
"<Figure size 432x288 with 1 Axes>"
|
| 232 |
-
]
|
| 233 |
-
},
|
| 234 |
-
"metadata": {
|
| 235 |
-
"needs_background": "light"
|
| 236 |
-
},
|
| 237 |
-
"output_type": "display_data"
|
| 238 |
-
}
|
| 239 |
-
],
|
| 240 |
-
"source": [
|
| 241 |
-
"# Plotting our accuracy charts\n",
|
| 242 |
-
"import matplotlib.pyplot as plt\n",
|
| 243 |
-
"\n",
|
| 244 |
-
"history_dict = history.history\n",
|
| 245 |
-
"\n",
|
| 246 |
-
"acc_values = history_dict['accuracy']\n",
|
| 247 |
-
"val_acc_values = history_dict['val_accuracy']\n",
|
| 248 |
-
"epochs = range(1, len(loss_values) + 1)\n",
|
| 249 |
-
"\n",
|
| 250 |
-
"line1 = plt.plot(epochs, val_acc_values, label='Validation/Test Accuracy')\n",
|
| 251 |
-
"line2 = plt.plot(epochs, acc_values, label='Training Accuracy')\n",
|
| 252 |
-
"plt.setp(line1, linewidth=2.0, marker = '+', markersize=10.0)\n",
|
| 253 |
-
"plt.setp(line2, linewidth=2.0, marker = '4', markersize=10.0)\n",
|
| 254 |
-
"plt.xlabel('Epochs') \n",
|
| 255 |
-
"plt.ylabel('Accuracy')\n",
|
| 256 |
-
"plt.grid(True)\n",
|
| 257 |
-
"plt.legend()\n",
|
| 258 |
-
"plt.show()"
|
| 259 |
-
]
|
| 260 |
-
},
|
| 261 |
-
{
|
| 262 |
-
"cell_type": "markdown",
|
| 263 |
-
"metadata": {},
|
| 264 |
-
"source": [
|
| 265 |
-
"#### Now let's display our Confusion Matrix and Classification Report"
|
| 266 |
-
]
|
| 267 |
-
},
|
| 268 |
-
{
|
| 269 |
-
"cell_type": "code",
|
| 270 |
-
"execution_count": 8,
|
| 271 |
-
"metadata": {},
|
| 272 |
-
"outputs": [
|
| 273 |
-
{
|
| 274 |
-
"name": "stdout",
|
| 275 |
-
"output_type": "stream",
|
| 276 |
-
"text": [
|
| 277 |
-
" precision recall f1-score support\n",
|
| 278 |
-
"\n",
|
| 279 |
-
" 0 0.94 0.98 0.96 980\n",
|
| 280 |
-
" 1 0.95 0.99 0.97 1135\n",
|
| 281 |
-
" 2 0.94 0.91 0.92 1032\n",
|
| 282 |
-
" 3 0.88 0.94 0.91 1010\n",
|
| 283 |
-
" 4 0.91 0.94 0.93 982\n",
|
| 284 |
-
" 5 0.96 0.85 0.90 892\n",
|
| 285 |
-
" 6 0.93 0.96 0.95 958\n",
|
| 286 |
-
" 7 0.90 0.94 0.92 1028\n",
|
| 287 |
-
" 8 0.92 0.88 0.90 974\n",
|
| 288 |
-
" 9 0.95 0.87 0.91 1009\n",
|
| 289 |
-
"\n",
|
| 290 |
-
" accuracy 0.93 10000\n",
|
| 291 |
-
" macro avg 0.93 0.93 0.93 10000\n",
|
| 292 |
-
"weighted avg 0.93 0.93 0.93 10000\n",
|
| 293 |
-
"\n",
|
| 294 |
-
"[[ 964 0 1 3 0 2 7 1 2 0]\n",
|
| 295 |
-
" [ 0 1120 4 2 0 1 3 0 5 0]\n",
|
| 296 |
-
" [ 12 4 940 14 10 0 11 19 21 1]\n",
|
| 297 |
-
" [ 1 2 11 950 1 5 0 18 17 5]\n",
|
| 298 |
-
" [ 1 4 6 0 925 1 18 3 2 22]\n",
|
| 299 |
-
" [ 18 3 4 60 10 754 16 4 19 4]\n",
|
| 300 |
-
" [ 13 3 4 2 3 9 920 1 3 0]\n",
|
| 301 |
-
" [ 2 21 20 2 9 0 0 963 3 8]\n",
|
| 302 |
-
" [ 7 10 8 33 10 11 11 24 856 4]\n",
|
| 303 |
-
" [ 11 9 3 18 49 3 1 39 3 873]]\n"
|
| 304 |
-
]
|
| 305 |
-
}
|
| 306 |
-
],
|
| 307 |
-
"source": [
|
| 308 |
-
"from sklearn.metrics import classification_report,confusion_matrix\n",
|
| 309 |
-
"import numpy as np\n",
|
| 310 |
-
"\n",
|
| 311 |
-
"y_pred = model.predict_classes(x_test)\n",
|
| 312 |
-
"\n",
|
| 313 |
-
"print(classification_report(np.argmax(y_test,axis=1), y_pred))\n",
|
| 314 |
-
"print(confusion_matrix(np.argmax(y_test,axis=1), y_pred))"
|
| 315 |
-
]
|
| 316 |
-
},
|
| 317 |
-
{
|
| 318 |
-
"cell_type": "markdown",
|
| 319 |
-
"metadata": {},
|
| 320 |
-
"source": [
|
| 321 |
-
"### Displaying our misclassified data"
|
| 322 |
-
]
|
| 323 |
-
},
|
| 324 |
-
{
|
| 325 |
-
"cell_type": "code",
|
| 326 |
-
"execution_count": 9,
|
| 327 |
-
"metadata": {},
|
| 328 |
-
"outputs": [
|
| 329 |
-
{
|
| 330 |
-
"name": "stdout",
|
| 331 |
-
"output_type": "stream",
|
| 332 |
-
"text": [
|
| 333 |
-
"Indices of misclassifed data are: \n",
|
| 334 |
-
"\n",
|
| 335 |
-
"(array([ 8, 33, 62, 66, 73, 77, 121, 124, 151, 193, 195,\n",
|
| 336 |
-
" 217, 233, 241, 247, 259, 290, 300, 313, 318, 320, 321,\n",
|
| 337 |
-
" 340, 341, 349, 352, 359, 362, 381, 403, 406, 412, 435,\n",
|
| 338 |
-
" 444, 445, 448, 464, 478, 479, 483, 495, 502, 507, 511,\n",
|
| 339 |
-
" 515, 528, 530, 543, 551, 565, 578, 582, 591, 606, 610,\n",
|
| 340 |
-
" 613, 619, 624, 628, 659, 667, 684, 689, 691, 707, 717,\n",
|
| 341 |
-
" 720, 728, 740, 791, 839, 844, 924, 939, 944, 947, 950,\n",
|
| 342 |
-
" 951, 956, 965, 975, 982, 992, 1003, 1014, 1032, 1033, 1039,\n",
|
| 343 |
-
" 1044, 1062, 1068, 1082, 1089, 1101, 1107, 1112, 1114, 1119, 1128,\n",
|
| 344 |
-
" 1152, 1181, 1192, 1198, 1200, 1204, 1206, 1224, 1226, 1228, 1232,\n",
|
| 345 |
-
" 1234, 1242, 1243, 1247, 1253, 1256, 1260, 1270, 1272, 1283, 1289,\n",
|
| 346 |
-
" 1299, 1319, 1326, 1378, 1393, 1402, 1409, 1423, 1433, 1440, 1453,\n",
|
| 347 |
-
" 1465, 1466, 1476, 1500, 1514, 1525, 1527, 1530, 1549, 1553, 1554,\n",
|
| 348 |
-
" 1559, 1581, 1587, 1601, 1609, 1621, 1634, 1640, 1678, 1681, 1709,\n",
|
| 349 |
-
" 1716, 1717, 1718, 1722, 1732, 1737, 1751, 1754, 1772, 1782, 1790,\n",
|
| 350 |
-
" 1819, 1850, 1874, 1878, 1899, 1901, 1917, 1930, 1938, 1940, 1948,\n",
|
| 351 |
-
" 1952, 1955, 1968, 1970, 1973, 1982, 1984, 2016, 2024, 2035, 2037,\n",
|
| 352 |
-
" 2040, 2043, 2044, 2052, 2053, 2068, 2070, 2093, 2098, 2099, 2109,\n",
|
| 353 |
-
" 2118, 2125, 2129, 2130, 2135, 2138, 2177, 2182, 2183, 2185, 2186,\n",
|
| 354 |
-
" 2189, 2192, 2208, 2215, 2224, 2232, 2266, 2272, 2293, 2299, 2325,\n",
|
| 355 |
-
" 2339, 2362, 2369, 2371, 2378, 2381, 2386, 2387, 2393, 2394, 2395,\n",
|
| 356 |
-
" 2404, 2406, 2408, 2414, 2422, 2425, 2447, 2460, 2488, 2515, 2526,\n",
|
| 357 |
-
" 2528, 2542, 2545, 2556, 2559, 2560, 2578, 2582, 2586, 2589, 2598,\n",
|
| 358 |
-
" 2604, 2607, 2610, 2631, 2635, 2648, 2654, 2670, 2695, 2698, 2730,\n",
|
| 359 |
-
" 2740, 2751, 2758, 2760, 2770, 2771, 2780, 2810, 2812, 2832, 2850,\n",
|
| 360 |
-
" 2863, 2896, 2905, 2914, 2925, 2927, 2930, 2945, 2953, 2970, 2986,\n",
|
| 361 |
-
" 2990, 2995, 3005, 3060, 3073, 3078, 3100, 3102, 3110, 3114, 3117,\n",
|
| 362 |
-
" 3130, 3132, 3133, 3136, 3139, 3145, 3157, 3167, 3189, 3206, 3225,\n",
|
| 363 |
-
" 3240, 3269, 3284, 3289, 3316, 3319, 3329, 3330, 3333, 3369, 3405,\n",
|
| 364 |
-
" 3406, 3414, 3436, 3475, 3503, 3520, 3549, 3552, 3558, 3565, 3567,\n",
|
| 365 |
-
" 3573, 3578, 3580, 3597, 3598, 3604, 3618, 3629, 3662, 3664, 3681,\n",
|
| 366 |
-
" 3687, 3702, 3709, 3716, 3718, 3726, 3732, 3751, 3757, 3763, 3767,\n",
|
| 367 |
-
" 3769, 3776, 3778, 3780, 3796, 3806, 3808, 3811, 3817, 3818, 3821,\n",
|
| 368 |
-
" 3833, 3836, 3838, 3846, 3848, 3853, 3855, 3862, 3869, 3876, 3893,\n",
|
| 369 |
-
" 3902, 3906, 3924, 3926, 3929, 3941, 3946, 3954, 3962, 3968, 3976,\n",
|
| 370 |
-
" 3984, 3985, 3998, 4000, 4031, 4063, 4065, 4068, 4072, 4075, 4076,\n",
|
| 371 |
-
" 4078, 4093, 4111, 4131, 4140, 4145, 4152, 4154, 4159, 4163, 4176,\n",
|
| 372 |
-
" 4201, 4205, 4211, 4212, 4224, 4238, 4248, 4255, 4265, 4271, 4272,\n",
|
| 373 |
-
" 4284, 4289, 4294, 4297, 4300, 4302, 4306, 4313, 4315, 4317, 4341,\n",
|
| 374 |
-
" 4355, 4360, 4374, 4380, 4405, 4425, 4433, 4435, 4449, 4451, 4454,\n",
|
| 375 |
-
" 4477, 4497, 4498, 4500, 4505, 4521, 4523, 4540, 4548, 4567, 4571,\n",
|
| 376 |
-
" 4575, 4601, 4615, 4633, 4639, 4640, 4662, 4671, 4724, 4731, 4735,\n",
|
| 377 |
-
" 4751, 4785, 4807, 4808, 4814, 4823, 4828, 4829, 4837, 4863, 4874,\n",
|
| 378 |
-
" 4876, 4879, 4880, 4886, 4890, 4910, 4943, 4950, 4952, 4956, 4966,\n",
|
| 379 |
-
" 4978, 4990, 5001, 5009, 5015, 5065, 5067, 5068, 5100, 5135, 5140,\n",
|
| 380 |
-
" 5210, 5217, 5246, 5299, 5311, 5331, 5360, 5457, 5522, 5562, 5600,\n",
|
| 381 |
-
" 5601, 5611, 5634, 5642, 5677, 5734, 5735, 5749, 5757, 5821, 5842,\n",
|
| 382 |
-
" 5852, 5862, 5867, 5874, 5887, 5888, 5891, 5913, 5922, 5936, 5937,\n",
|
| 383 |
-
" 5955, 5957, 5972, 5973, 5981, 5982, 5985, 6035, 6042, 6043, 6059,\n",
|
| 384 |
-
" 6071, 6081, 6091, 6112, 6124, 6157, 6166, 6168, 6172, 6173, 6304,\n",
|
| 385 |
-
" 6347, 6385, 6400, 6421, 6425, 6426, 6505, 6517, 6555, 6560, 6568,\n",
|
| 386 |
-
" 6569, 6571, 6574, 6576, 6577, 6597, 6598, 6603, 6625, 6641, 6642,\n",
|
| 387 |
-
" 6651, 6662, 6706, 6721, 6725, 6740, 6744, 6746, 6755, 6765, 6775,\n",
|
| 388 |
-
" 6784, 6785, 6793, 6817, 6870, 6872, 6894, 6895, 6906, 6919, 6926,\n",
|
| 389 |
-
" 7003, 7035, 7043, 7094, 7121, 7130, 7198, 7212, 7235, 7372, 7432,\n",
|
| 390 |
-
" 7434, 7451, 7459, 7473, 7492, 7498, 7579, 7580, 7637, 7672, 7673,\n",
|
| 391 |
-
" 7756, 7777, 7779, 7786, 7797, 7821, 7823, 7849, 7859, 7886, 7888,\n",
|
| 392 |
-
" 7905, 7918, 7921, 7945, 7991, 8020, 8044, 8062, 8072, 8081, 8091,\n",
|
| 393 |
-
" 8094, 8095, 8165, 8183, 8196, 8198, 8246, 8279, 8332, 8339, 8408,\n",
|
| 394 |
-
" 8410, 8426, 8431, 8457, 8477, 8520, 8522, 8530, 8639, 8912, 9007,\n",
|
| 395 |
-
" 9009, 9010, 9015, 9016, 9019, 9024, 9026, 9036, 9045, 9168, 9211,\n",
|
| 396 |
-
" 9245, 9280, 9316, 9422, 9427, 9433, 9446, 9456, 9465, 9482, 9530,\n",
|
| 397 |
-
" 9554, 9560, 9587, 9610, 9624, 9634, 9642, 9664, 9679, 9680, 9692,\n",
|
| 398 |
-
" 9698, 9700, 9712, 9716, 9719, 9726, 9729, 9740, 9741, 9744, 9745,\n",
|
| 399 |
-
" 9749, 9751, 9752, 9755, 9768, 9770, 9777, 9779, 9780, 9792, 9808,\n",
|
| 400 |
-
" 9811, 9832, 9839, 9858, 9867, 9874, 9883, 9888, 9890, 9892, 9893,\n",
|
| 401 |
-
" 9905, 9925, 9941, 9944, 9959, 9970, 9975, 9980, 9982], dtype=int64),)\n"
|
| 402 |
-
]
|
| 403 |
-
}
|
| 404 |
-
],
|
| 405 |
-
"source": [
|
| 406 |
-
"import cv2\n",
|
| 407 |
-
"import numpy as np\n",
|
| 408 |
-
"from tensorflow.keras.datasets import mnist\n",
|
| 409 |
-
"\n",
|
| 410 |
-
"# loads the MNIST dataset\n",
|
| 411 |
-
"(x_train, y_train), (x_test, y_test) = mnist.load_data()\n",
|
| 412 |
-
"\n",
|
| 413 |
-
"# Use numpy to create an array that stores a value of 1 when a misclassification occurs\n",
|
| 414 |
-
"result = np.absolute(y_test - y_pred)\n",
|
| 415 |
-
"result_indices = np.nonzero(result > 0)\n",
|
| 416 |
-
"\n",
|
| 417 |
-
"# Display the indices of mislassifications\n",
|
| 418 |
-
"print(\"Indices of misclassifed data are: \\n\\n\" + str(result_indices))"
|
| 419 |
-
]
|
| 420 |
-
},
|
| 421 |
-
{
|
| 422 |
-
"cell_type": "markdown",
|
| 423 |
-
"metadata": {},
|
| 424 |
-
"source": [
|
| 425 |
-
"### Displaying the misclassifications"
|
| 426 |
-
]
|
| 427 |
-
},
|
| 428 |
-
{
|
| 429 |
-
"cell_type": "code",
|
| 430 |
-
"execution_count": 10,
|
| 431 |
-
"metadata": {},
|
| 432 |
-
"outputs": [],
|
| 433 |
-
"source": [
|
| 434 |
-
"import cv2 \n",
|
| 435 |
-
"#from keras.models import load_model\n",
|
| 436 |
-
"\n",
|
| 437 |
-
"#classifier = load_model('/home/deeplearningcv/DeepLearningCV/Trained Models/mnist_simple_cnn.h5')\n",
|
| 438 |
-
"\n",
|
| 439 |
-
"def draw_test(name, pred, input_im, true_label):\n",
|
| 440 |
-
" BLACK = [0,0,0]\n",
|
| 441 |
-
" expanded_image = cv2.copyMakeBorder(input_im, 0, 0, 0, imageL.shape[0]*2 ,cv2.BORDER_CONSTANT,value=BLACK)\n",
|
| 442 |
-
" expanded_image = cv2.cvtColor(expanded_image, cv2.COLOR_GRAY2BGR)\n",
|
| 443 |
-
" cv2.putText(expanded_image, str(pred), (152, 70) , cv2.FONT_HERSHEY_COMPLEX_SMALL,4, (0,255,0), 2)\n",
|
| 444 |
-
" cv2.putText(expanded_image, str(true_label), (250, 70) , cv2.FONT_HERSHEY_COMPLEX_SMALL,4, (0,0,255), 2)\n",
|
| 445 |
-
" cv2.imshow(name, expanded_image)\n",
|
| 446 |
-
"\n",
|
| 447 |
-
"for i in range(0,10):\n",
|
| 448 |
-
"\n",
|
| 449 |
-
" input_im = x_test[result_indices[0][i]]\n",
|
| 450 |
-
" #print(y_test[result_indices[0][i]])\n",
|
| 451 |
-
" imageL = cv2.resize(input_im, None, fx=4, fy=4, interpolation = cv2.INTER_CUBIC) \n",
|
| 452 |
-
" input_im = input_im.reshape(1,28,28,1) \n",
|
| 453 |
-
" \n",
|
| 454 |
-
" ## Get Prediction\n",
|
| 455 |
-
" res = str(model.predict_classes(input_im, 1, verbose = 0)[0])\n",
|
| 456 |
-
" draw_test(\"Prediction\", res, imageL, y_test[result_indices[0][i]]) \n",
|
| 457 |
-
" cv2.waitKey(0)\n",
|
| 458 |
-
"\n",
|
| 459 |
-
"cv2.destroyAllWindows()"
|
| 460 |
-
]
|
| 461 |
-
}
|
| 462 |
-
],
|
| 463 |
-
"metadata": {
|
| 464 |
-
"kernelspec": {
|
| 465 |
-
"display_name": "Python 3",
|
| 466 |
-
"language": "python",
|
| 467 |
-
"name": "python3"
|
| 468 |
-
},
|
| 469 |
-
"language_info": {
|
| 470 |
-
"codemirror_mode": {
|
| 471 |
-
"name": "ipython",
|
| 472 |
-
"version": 3
|
| 473 |
-
},
|
| 474 |
-
"file_extension": ".py",
|
| 475 |
-
"mimetype": "text/x-python",
|
| 476 |
-
"name": "python",
|
| 477 |
-
"nbconvert_exporter": "python",
|
| 478 |
-
"pygments_lexer": "ipython3",
|
| 479 |
-
"version": "3.7.4"
|
| 480 |
-
}
|
| 481 |
-
},
|
| 482 |
-
"nbformat": 4,
|
| 483 |
-
"nbformat_minor": 2
|
| 484 |
-
}
|
|
|
|
| 1 |
+
version https://git-lfs.github.com/spec/v1
|
| 2 |
+
oid sha256:c47f97002861e52ea07c9f8fe4532c4528ef53fa510b089837a705770ca7d5d5
|
| 3 |
+
size 45598
|
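
Note: the removed notebook calls model.predict_classes, which only ever existed on Sequential models and was dropped in TensorFlow 2.6. A minimal sketch of the usual replacement, assuming the model and x_test variables from the cells above:

    import numpy as np

    # predict() returns one softmax vector per test image, shape (10000, 10);
    # the argmax over the class axis recovers the integer labels 0-9 that
    # predict_classes used to return.
    y_prob = model.predict(x_test)
    y_pred = np.argmax(y_prob, axis=1)
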
12. Optimizers, Adaptive Learning Rate & Callbacks/12.2 Checkpointing Models and Early Stopping.ipynb
CHANGED
|
@@ -1,277 +1,3 @@
|
|
| 1 |
-
{
|
| 2 |
-
"cells": [
|
| 3 |
-
{
|
| 4 |
-
"cell_type": "code",
|
| 5 |
-
"execution_count": 1,
|
| 6 |
-
"metadata": {},
|
| 7 |
-
"outputs": [
|
| 8 |
-
{
|
| 9 |
-
"name": "stdout",
|
| 10 |
-
"output_type": "stream",
|
| 11 |
-
"text": [
|
| 12 |
-
"x_train shape: (60000, 28, 28, 1)\n",
|
| 13 |
-
"60000 train samples\n",
|
| 14 |
-
"10000 test samples\n",
|
| 15 |
-
"Number of Classes: 10\n",
|
| 16 |
-
"Model: \"sequential\"\n",
|
| 17 |
-
"_________________________________________________________________\n",
|
| 18 |
-
"Layer (type) Output Shape Param # \n",
|
| 19 |
-
"=================================================================\n",
|
| 20 |
-
"conv2d (Conv2D) (None, 26, 26, 32) 320 \n",
|
| 21 |
-
"_________________________________________________________________\n",
|
| 22 |
-
"conv2d_1 (Conv2D) (None, 24, 24, 64) 18496 \n",
|
| 23 |
-
"_________________________________________________________________\n",
|
| 24 |
-
"max_pooling2d (MaxPooling2D) (None, 12, 12, 64) 0 \n",
|
| 25 |
-
"_________________________________________________________________\n",
|
| 26 |
-
"dropout (Dropout) (None, 12, 12, 64) 0 \n",
|
| 27 |
-
"_________________________________________________________________\n",
|
| 28 |
-
"flatten (Flatten) (None, 9216) 0 \n",
|
| 29 |
-
"_________________________________________________________________\n",
|
| 30 |
-
"dense (Dense) (None, 128) 1179776 \n",
|
| 31 |
-
"_________________________________________________________________\n",
|
| 32 |
-
"dropout_1 (Dropout) (None, 128) 0 \n",
|
| 33 |
-
"_________________________________________________________________\n",
|
| 34 |
-
"dense_1 (Dense) (None, 10) 1290 \n",
|
| 35 |
-
"=================================================================\n",
|
| 36 |
-
"Total params: 1,199,882\n",
|
| 37 |
-
"Trainable params: 1,199,882\n",
|
| 38 |
-
"Non-trainable params: 0\n",
|
| 39 |
-
"_________________________________________________________________\n",
|
| 40 |
-
"None\n"
|
| 41 |
-
]
|
| 42 |
-
}
|
| 43 |
-
],
|
| 44 |
-
"source": [
|
| 45 |
-
"from tensorflow.keras.datasets import mnist\n",
|
| 46 |
-
"from tensorflow.keras.utils import to_categorical\n",
|
| 47 |
-
"import tensorflow as tf\n",
|
| 48 |
-
"from tensorflow.keras.optimizers import SGD \n",
|
| 49 |
-
"from tensorflow.keras.datasets import mnist\n",
|
| 50 |
-
"from tensorflow.keras.models import Sequential\n",
|
| 51 |
-
"from tensorflow.keras.layers import Dense, Dropout, Flatten\n",
|
| 52 |
-
"from tensorflow.keras.layers import Conv2D, MaxPooling2D\n",
|
| 53 |
-
"from tensorflow.keras import backend as K\n",
|
| 54 |
-
"from tensorflow.keras.callbacks import ModelCheckpoint, EarlyStopping\n",
|
| 55 |
-
"import os\n",
|
| 56 |
-
"\n",
|
| 57 |
-
"# Training Parameters\n",
|
| 58 |
-
"batch_size = 64\n",
|
| 59 |
-
"epochs = 15\n",
|
| 60 |
-
"\n",
|
| 61 |
-
"# loads the MNIST dataset\n",
|
| 62 |
-
"(x_train, y_train), (x_test, y_test) = mnist.load_data()\n",
|
| 63 |
-
"\n",
|
| 64 |
-
"# Lets store the number of rows and columns\n",
|
| 65 |
-
"img_rows = x_train[0].shape[0]\n",
|
| 66 |
-
"img_cols = x_train[1].shape[0]\n",
|
| 67 |
-
"\n",
|
| 68 |
-
"# Getting our date in the right 'shape' needed for Keras\n",
|
| 69 |
-
"# We need to add a 4th dimenion to our date thereby changing our\n",
|
| 70 |
-
"# Our original image shape of (60000,28,28) to (60000,28,28,1)\n",
|
| 71 |
-
"x_train = x_train.reshape(x_train.shape[0], img_rows, img_cols, 1)\n",
|
| 72 |
-
"x_test = x_test.reshape(x_test.shape[0], img_rows, img_cols, 1)\n",
|
| 73 |
-
"\n",
|
| 74 |
-
"# store the shape of a single image \n",
|
| 75 |
-
"input_shape = (img_rows, img_cols, 1)\n",
|
| 76 |
-
"\n",
|
| 77 |
-
"# change our image type to float32 data type\n",
|
| 78 |
-
"x_train = x_train.astype('float32')\n",
|
| 79 |
-
"x_test = x_test.astype('float32')\n",
|
| 80 |
-
"\n",
|
| 81 |
-
"# Normalize our data by changing the range from (0 to 255) to (0 to 1)\n",
|
| 82 |
-
"x_train /= 255\n",
|
| 83 |
-
"x_test /= 255\n",
|
| 84 |
-
"\n",
|
| 85 |
-
"print('x_train shape:', x_train.shape)\n",
|
| 86 |
-
"print(x_train.shape[0], 'train samples')\n",
|
| 87 |
-
"print(x_test.shape[0], 'test samples')\n",
|
| 88 |
-
"\n",
|
| 89 |
-
"# Now we one hot encode outputs\n",
|
| 90 |
-
"y_train = to_categorical(y_train)\n",
|
| 91 |
-
"y_test = to_categorical(y_test)\n",
|
| 92 |
-
"\n",
|
| 93 |
-
"# Let's count the number columns in our hot encoded matrix \n",
|
| 94 |
-
"print (\"Number of Classes: \" + str(y_test.shape[1]))\n",
|
| 95 |
-
"\n",
|
| 96 |
-
"num_classes = y_test.shape[1]\n",
|
| 97 |
-
"num_pixels = x_train.shape[1] * x_train.shape[2]\n",
|
| 98 |
-
"\n",
|
| 99 |
-
"# create model\n",
|
| 100 |
-
"model = Sequential()\n",
|
| 101 |
-
"\n",
|
| 102 |
-
"model.add(Conv2D(32, kernel_size=(3, 3),\n",
|
| 103 |
-
" activation='relu',\n",
|
| 104 |
-
" input_shape=input_shape))\n",
|
| 105 |
-
"model.add(Conv2D(64, (3, 3), activation='relu'))\n",
|
| 106 |
-
"model.add(MaxPooling2D(pool_size=(2, 2)))\n",
|
| 107 |
-
"model.add(Dropout(0.25))\n",
|
| 108 |
-
"model.add(Flatten())\n",
|
| 109 |
-
"model.add(Dense(128, activation='relu'))\n",
|
| 110 |
-
"model.add(Dropout(0.5))\n",
|
| 111 |
-
"model.add(Dense(num_classes, activation='softmax'))\n",
|
| 112 |
-
"\n",
|
| 113 |
-
"model.compile(loss = 'categorical_crossentropy',\n",
|
| 114 |
-
" optimizer = SGD(0.01),\n",
|
| 115 |
-
" metrics = ['accuracy'])\n",
|
| 116 |
-
"\n",
|
| 117 |
-
"print(model.summary())\n",
|
| 118 |
-
"\n",
|
| 119 |
-
" \n",
|
| 120 |
-
"checkpoint = ModelCheckpoint(\"MNIST_Checkpoint.h5\",\n",
|
| 121 |
-
" monitor=\"val_loss\",\n",
|
| 122 |
-
" mode=\"min\",\n",
|
| 123 |
-
" save_best_only = True,\n",
|
| 124 |
-
" verbose=1)\n",
|
| 125 |
-
"callbacks = [checkpoint]\n"
|
| 126 |
-
]
|
| 127 |
-
},
|
| 128 |
-
{
|
| 129 |
-
"cell_type": "code",
|
| 130 |
-
"execution_count": 2,
|
| 131 |
-
"metadata": {},
|
| 132 |
-
"outputs": [
|
| 133 |
-
{
|
| 134 |
-
"name": "stdout",
|
| 135 |
-
"output_type": "stream",
|
| 136 |
-
"text": [
|
| 137 |
-
"Train on 60000 samples, validate on 10000 samples\n",
|
| 138 |
-
"\n",
|
| 139 |
-
"Epoch 00001: val_loss improved from inf to 0.24715, saving model to MNIST_Checkpoint.h5\n",
|
| 140 |
-
"60000/60000 - 97s - loss: 0.7167 - accuracy: 0.7759 - val_loss: 0.2471 - val_accuracy: 0.9275\n",
|
| 141 |
-
"Test loss: 0.24714763118624689\n",
|
| 142 |
-
"Test accuracy: 0.9275\n"
|
| 143 |
-
]
|
| 144 |
-
}
|
| 145 |
-
],
|
| 146 |
-
"source": [
|
| 147 |
-
"history = model.fit(x_train, y_train,\n",
|
| 148 |
-
" batch_size = batch_size,\n",
|
| 149 |
-
" epochs = epochs,\n",
|
| 150 |
-
" verbose = 2,\n",
|
| 151 |
-
" callbacks = callbacks,\n",
|
| 152 |
-
" validation_data = (x_test, y_test))\n",
|
| 153 |
-
"\n",
|
| 154 |
-
"score = model.evaluate(x_test, y_test, verbose=0)\n",
|
| 155 |
-
"print('Test loss:', score[0])\n",
|
| 156 |
-
"print('Test accuracy:', score[1])"
|
| 157 |
-
]
|
| 158 |
-
},
|
| 159 |
-
{
|
| 160 |
-
"cell_type": "markdown",
|
| 161 |
-
"metadata": {},
|
| 162 |
-
"source": [
|
| 163 |
-
"### Adding Multiple Call Backs & Early Stopping\n",
|
| 164 |
-
"\n",
|
| 165 |
-
"We can use other call back methods to monitor our training process such as **Early Stopping**. Checkout the Keras documentation for more:\n",
|
| 166 |
-
"- https://keras.io/callbacks/"
|
| 167 |
-
]
|
| 168 |
-
},
|
| 169 |
-
{
|
| 170 |
-
"cell_type": "code",
|
| 171 |
-
"execution_count": 4,
|
| 172 |
-
"metadata": {},
|
| 173 |
-
"outputs": [],
|
| 174 |
-
"source": [
|
| 175 |
-
"from tensorflow.keras.callbacks import EarlyStopping\n",
|
| 176 |
-
"\n",
|
| 177 |
-
"earlystop = EarlyStopping(monitor = 'val_loss', # value being monitored for improvement\n",
|
| 178 |
-
" min_delta = 0, #Abs value and is the min change required before we stop\n",
|
| 179 |
-
" patience = 3, #Number of epochs we wait before stopping \n",
|
| 180 |
-
" verbose = 1,\n",
|
| 181 |
-
" restore_best_weights = True) #keeps the best weigths once stopped\n",
|
| 182 |
-
"\n",
|
| 183 |
-
"# we put our call backs into a callback list\n",
|
| 184 |
-
"callbacks = [earlystop, checkpoint]"
|
| 185 |
-
]
|
| 186 |
-
},
|
| 187 |
-
{
|
| 188 |
-
"cell_type": "markdown",
|
| 189 |
-
"metadata": {},
|
| 190 |
-
"source": [
|
| 191 |
-
"### We can attempt to run again to see if it worked!"
|
| 192 |
-
]
|
| 193 |
-
},
|
| 194 |
-
{
|
| 195 |
-
"cell_type": "code",
|
| 196 |
-
"execution_count": 5,
|
| 197 |
-
"metadata": {},
|
| 198 |
-
"outputs": [
|
| 199 |
-
{
|
| 200 |
-
"name": "stdout",
|
| 201 |
-
"output_type": "stream",
|
| 202 |
-
"text": [
|
| 203 |
-
"Train on 60000 samples, validate on 10000 samples\n",
|
| 204 |
-
"Epoch 1/3\n",
|
| 205 |
-
"\n",
|
| 206 |
-
"Epoch 00001: val_loss improved from 0.24715 to 0.18733, saving model to MNIST_Checkpoint.h5\n",
|
| 207 |
-
"60000/60000 - 105s - loss: 0.3593 - accuracy: 0.8905 - val_loss: 0.1873 - val_accuracy: 0.9437\n",
|
| 208 |
-
"Epoch 2/3\n",
|
| 209 |
-
"\n",
|
| 210 |
-
"Epoch 00002: val_loss improved from 0.18733 to 0.15683, saving model to MNIST_Checkpoint.h5\n",
|
| 211 |
-
"60000/60000 - 105s - loss: 0.3018 - accuracy: 0.9084 - val_loss: 0.1568 - val_accuracy: 0.9525\n",
|
| 212 |
-
"Epoch 3/3\n",
|
| 213 |
-
"\n",
|
| 214 |
-
"Epoch 00003: val_loss improved from 0.15683 to 0.13865, saving model to MNIST_Checkpoint.h5\n",
|
| 215 |
-
"60000/60000 - 108s - loss: 0.2658 - accuracy: 0.9205 - val_loss: 0.1386 - val_accuracy: 0.9578\n",
|
| 216 |
-
"Test loss: 0.1386499687358737\n",
|
| 217 |
-
"Test accuracy: 0.9578\n"
|
| 218 |
-
]
|
| 219 |
-
}
|
| 220 |
-
],
|
| 221 |
-
"source": [
|
| 222 |
-
"history = model.fit(x_train, y_train,\n",
|
| 223 |
-
" batch_size=64,\n",
|
| 224 |
-
" epochs=3,\n",
|
| 225 |
-
" verbose=2,\n",
|
| 226 |
-
" callbacks = callbacks,\n",
|
| 227 |
-
" validation_data=(x_test, y_test))\n",
|
| 228 |
-
"\n",
|
| 229 |
-
"\n",
|
| 230 |
-
"score = model.evaluate(x_test, y_test, verbose=0)\n",
|
| 231 |
-
"print('Test loss:', score[0])\n",
|
| 232 |
-
"print('Test accuracy:', score[1])"
|
| 233 |
-
]
|
| 234 |
-
},
|
| 235 |
-
{
|
| 236 |
-
"cell_type": "markdown",
|
| 237 |
-
"metadata": {},
|
| 238 |
-
"source": [
|
| 239 |
-
"### Another useful callback is Reducing our learning Rate on Plateau\n",
|
| 240 |
-
"\n",
|
| 241 |
-
"We can avoid having our oscillate around the global minimum by attempting to reduce the Learn Rate by a certain fact. If no improvement is seen in our monitored metric (val_loss typically), we wait a certain number of epochs (patience) then this callback reduces the learning rate by a factor"
|
| 242 |
-
]
|
| 243 |
-
},
|
| 244 |
-
{
|
| 245 |
-
"cell_type": "code",
|
| 246 |
-
"execution_count": 6,
|
| 247 |
-
"metadata": {},
|
| 248 |
-
"outputs": [],
|
| 249 |
-
"source": [
|
| 250 |
-
"from tensorflow.keras.callbacks import ReduceLROnPlateau\n",
|
| 251 |
-
"\n",
|
| 252 |
-
"reduce_lr = ReduceLROnPlateau(monitor = 'val_loss', factor = 0.2, patience = 3, verbose = 1, min_delta = 0.0001)"
|
| 253 |
-
]
|
| 254 |
-
}
|
| 255 |
-
],
|
| 256 |
-
"metadata": {
|
| 257 |
-
"kernelspec": {
|
| 258 |
-
"display_name": "Python 3",
|
| 259 |
-
"language": "python",
|
| 260 |
-
"name": "python3"
|
| 261 |
-
},
|
| 262 |
-
"language_info": {
|
| 263 |
-
"codemirror_mode": {
|
| 264 |
-
"name": "ipython",
|
| 265 |
-
"version": 3
|
| 266 |
-
},
|
| 267 |
-
"file_extension": ".py",
|
| 268 |
-
"mimetype": "text/x-python",
|
| 269 |
-
"name": "python",
|
| 270 |
-
"nbconvert_exporter": "python",
|
| 271 |
-
"pygments_lexer": "ipython3",
|
| 272 |
-
"version": "3.7.4"
|
| 273 |
-
}
|
| 274 |
-
},
|
| 275 |
-
"nbformat": 4,
|
| 276 |
-
"nbformat_minor": 2
|
| 277 |
-
}
|
|
|
|
| 1 |
+
version https://git-lfs.github.com/spec/v1
|
| 2 |
+
oid sha256:ce5402efc021bce040ca722a277c1a8559913d052ac4f9255dd67b530ea6a77f
|
| 3 |
+
size 10213
|
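
Note: the checkpointing, early-stopping and learning-rate callbacks defined in the removed cells are meant to be passed together. A minimal sketch of wiring all three into a single training run, assuming the checkpoint, earlystop and reduce_lr objects built above:

    # Keras invokes every callback in the list at the end of each epoch:
    # save the best weights, stop when val_loss stalls, and shrink the
    # learning rate on a plateau.
    callbacks = [checkpoint, earlystop, reduce_lr]

    history = model.fit(x_train, y_train,
                        batch_size=64,
                        epochs=15,
                        verbose=2,
                        callbacks=callbacks,
                        validation_data=(x_test, y_test))
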
12. Optimizers, Adaptive Learning Rate & Callbacks/12.3 Building a Fruit Classifer.ipynb
CHANGED
|
The diff for this file is too large to render.
See raw diff
|
|
|
13. Building LeNet and AlexNet in Keras/13.1 Built LeNet and test on MNIST.ipynb
CHANGED
|
@@ -1,209 +1,3 @@
|
|
| 1 |
-
{
|
| 2 |
-
"cells": [
|
| 3 |
-
{
|
| 4 |
-
"cell_type": "markdown",
|
| 5 |
-
"metadata": {},
|
| 6 |
-
"source": [
|
| 7 |
-
"### Let's construct LeNet in Keras!\n",
|
| 8 |
-
"\n",
|
| 9 |
-
"#### First let's load and prep our MNIST data"
|
| 10 |
-
]
|
| 11 |
-
},
|
| 12 |
-
{
|
| 13 |
-
"cell_type": "code",
|
| 14 |
-
"execution_count": 2,
|
| 15 |
-
"metadata": {},
|
| 16 |
-
"outputs": [],
|
| 17 |
-
"source": [
|
| 18 |
-
"from tensorflow.keras.preprocessing.image import ImageDataGenerator\n",
|
| 19 |
-
"from tensorflow.keras.models import Sequential\n",
|
| 20 |
-
"from tensorflow.keras.layers import Dense, Dropout, Activation, Flatten\n",
|
| 21 |
-
"from tensorflow.keras.layers import Conv2D, MaxPooling2D, ZeroPadding2D\n",
|
| 22 |
-
"from tensorflow.keras.layers import BatchNormalization\n",
|
| 23 |
-
"from tensorflow.keras.regularizers import l2\n",
|
| 24 |
-
"from tensorflow.keras.datasets import mnist\n",
|
| 25 |
-
"from tensorflow.keras.utils import to_categorical\n",
|
| 26 |
-
"import tensorflow as tf\n",
|
| 27 |
-
"\n",
|
| 28 |
-
"# loads the MNIST dataset\n",
|
| 29 |
-
"(x_train, y_train), (x_test, y_test) = mnist.load_data()\n",
|
| 30 |
-
"\n",
|
| 31 |
-
"# Lets store the number of rows and columns\n",
|
| 32 |
-
"img_rows = x_train[0].shape[0]\n",
|
| 33 |
-
"img_cols = x_train[1].shape[0]\n",
|
| 34 |
-
"\n",
|
| 35 |
-
"# Getting our date in the right 'shape' needed for Keras\n",
|
| 36 |
-
"# We need to add a 4th dimenion to our date thereby changing our\n",
|
| 37 |
-
"# Our original image shape of (60000,28,28) to (60000,28,28,1)\n",
|
| 38 |
-
"x_train = x_train.reshape(x_train.shape[0], img_rows, img_cols, 1)\n",
|
| 39 |
-
"x_test = x_test.reshape(x_test.shape[0], img_rows, img_cols, 1)\n",
|
| 40 |
-
"\n",
|
| 41 |
-
"# store the shape of a single image \n",
|
| 42 |
-
"input_shape = (img_rows, img_cols, 1)\n",
|
| 43 |
-
"\n",
|
| 44 |
-
"# change our image type to float32 data type\n",
|
| 45 |
-
"x_train = x_train.astype('float32')\n",
|
| 46 |
-
"x_test = x_test.astype('float32')\n",
|
| 47 |
-
"\n",
|
| 48 |
-
"# Normalize our data by changing the range from (0 to 255) to (0 to 1)\n",
|
| 49 |
-
"x_train /= 255\n",
|
| 50 |
-
"x_test /= 255\n",
|
| 51 |
-
"\n",
|
| 52 |
-
"# Now we one hot encode outputs\n",
|
| 53 |
-
"y_train = to_categorical(y_train)\n",
|
| 54 |
-
"y_test = to_categorical(y_test)\n",
|
| 55 |
-
"\n",
|
| 56 |
-
"num_classes = y_test.shape[1]\n",
|
| 57 |
-
"num_pixels = x_train.shape[1] * x_train.shape[2]"
|
| 58 |
-
]
|
| 59 |
-
},
|
| 60 |
-
{
|
| 61 |
-
"cell_type": "markdown",
|
| 62 |
-
"metadata": {},
|
| 63 |
-
"source": [
|
| 64 |
-
"### Now let's create our layers to replicate LeNet"
|
| 65 |
-
]
|
| 66 |
-
},
|
| 67 |
-
{
|
| 68 |
-
"cell_type": "code",
|
| 69 |
-
"execution_count": 4,
|
| 70 |
-
"metadata": {},
|
| 71 |
-
"outputs": [
|
| 72 |
-
{
|
| 73 |
-
"name": "stdout",
|
| 74 |
-
"output_type": "stream",
|
| 75 |
-
"text": [
|
| 76 |
-
"Model: \"sequential\"\n",
|
| 77 |
-
"_________________________________________________________________\n",
|
| 78 |
-
"Layer (type) Output Shape Param # \n",
|
| 79 |
-
"=================================================================\n",
|
| 80 |
-
"conv2d (Conv2D) (None, 28, 28, 20) 520 \n",
|
| 81 |
-
"_________________________________________________________________\n",
|
| 82 |
-
"activation (Activation) (None, 28, 28, 20) 0 \n",
|
| 83 |
-
"_________________________________________________________________\n",
|
| 84 |
-
"max_pooling2d (MaxPooling2D) (None, 14, 14, 20) 0 \n",
|
| 85 |
-
"_________________________________________________________________\n",
|
| 86 |
-
"conv2d_1 (Conv2D) (None, 14, 14, 50) 25050 \n",
|
| 87 |
-
"_________________________________________________________________\n",
|
| 88 |
-
"activation_1 (Activation) (None, 14, 14, 50) 0 \n",
|
| 89 |
-
"_________________________________________________________________\n",
|
| 90 |
-
"max_pooling2d_1 (MaxPooling2 (None, 7, 7, 50) 0 \n",
|
| 91 |
-
"_________________________________________________________________\n",
|
| 92 |
-
"flatten (Flatten) (None, 2450) 0 \n",
|
| 93 |
-
"_________________________________________________________________\n",
|
| 94 |
-
"dense (Dense) (None, 500) 1225500 \n",
|
| 95 |
-
"_________________________________________________________________\n",
|
| 96 |
-
"activation_2 (Activation) (None, 500) 0 \n",
|
| 97 |
-
"_________________________________________________________________\n",
|
| 98 |
-
"dense_1 (Dense) (None, 10) 5010 \n",
|
| 99 |
-
"_________________________________________________________________\n",
|
| 100 |
-
"activation_3 (Activation) (None, 10) 0 \n",
|
| 101 |
-
"=================================================================\n",
|
| 102 |
-
"Total params: 1,256,080\n",
|
| 103 |
-
"Trainable params: 1,256,080\n",
|
| 104 |
-
"Non-trainable params: 0\n",
|
| 105 |
-
"_________________________________________________________________\n",
|
| 106 |
-
"None\n"
|
| 107 |
-
]
|
| 108 |
-
}
|
| 109 |
-
],
|
| 110 |
-
"source": [
|
| 111 |
-
"# create model\n",
|
| 112 |
-
"model = Sequential()\n",
|
| 113 |
-
"\n",
|
| 114 |
-
"# 2 sets of CRP (Convolution, RELU, Pooling)\n",
|
| 115 |
-
"model.add(Conv2D(20, (5, 5),\n",
|
| 116 |
-
" padding = \"same\", \n",
|
| 117 |
-
" input_shape = input_shape))\n",
|
| 118 |
-
"model.add(Activation(\"relu\"))\n",
|
| 119 |
-
"model.add(MaxPooling2D(pool_size = (2, 2), strides = (2, 2)))\n",
|
| 120 |
-
"\n",
|
| 121 |
-
"model.add(Conv2D(50, (5, 5),\n",
|
| 122 |
-
" padding = \"same\"))\n",
|
| 123 |
-
"model.add(Activation(\"relu\"))\n",
|
| 124 |
-
"model.add(MaxPooling2D(pool_size = (2, 2), strides = (2, 2)))\n",
|
| 125 |
-
"\n",
|
| 126 |
-
"# Fully connected layers (w/ RELU)\n",
|
| 127 |
-
"model.add(Flatten())\n",
|
| 128 |
-
"model.add(Dense(500))\n",
|
| 129 |
-
"model.add(Activation(\"relu\"))\n",
|
| 130 |
-
"\n",
|
| 131 |
-
"# Softmax (for classification)\n",
|
| 132 |
-
"model.add(Dense(num_classes))\n",
|
| 133 |
-
"model.add(Activation(\"softmax\"))\n",
|
| 134 |
-
" \n",
|
| 135 |
-
"model.compile(loss = 'categorical_crossentropy',\n",
|
| 136 |
-
" optimizer = tf.keras.optimizers.Adadelta(),\n",
|
| 137 |
-
" metrics = ['accuracy'])\n",
|
| 138 |
-
" \n",
|
| 139 |
-
"print(model.summary())"
|
| 140 |
-
]
|
| 141 |
-
},
|
| 142 |
-
{
|
| 143 |
-
"cell_type": "markdown",
|
| 144 |
-
"metadata": {},
|
| 145 |
-
"source": [
|
| 146 |
-
"### Now let us train LeNet on our MNIST Dataset"
|
| 147 |
-
]
|
| 148 |
-
},
|
| 149 |
-
{
|
| 150 |
-
"cell_type": "code",
|
| 151 |
-
"execution_count": 5,
|
| 152 |
-
"metadata": {
|
| 153 |
-
"scrolled": true
|
| 154 |
-
},
|
| 155 |
-
"outputs": [
|
| 156 |
-
{
|
| 157 |
-
"name": "stdout",
|
| 158 |
-
"output_type": "stream",
|
| 159 |
-
"text": [
|
| 160 |
-
"Train on 60000 samples, validate on 10000 samples\n",
|
| 161 |
-
"60000/60000 [==============================] - 82s 1ms/sample - loss: 2.2876 - accuracy: 0.1511 - val_loss: 2.2647 - val_accuracy: 0.2432\n",
|
| 162 |
-
"10000/10000 [==============================] - 4s 436us/sample - loss: 2.2647 - accuracy: 0.2432\n",
|
| 163 |
-
"Test loss: 2.264678302001953\n",
|
| 164 |
-
"Test accuracy: 0.2432\n"
|
| 165 |
-
]
|
| 166 |
-
}
|
| 167 |
-
],
|
| 168 |
-
"source": [
|
| 169 |
-
"# Training Parameters\n",
|
| 170 |
-
"batch_size = 128\n",
|
| 171 |
-
"epochs = 1\n",
|
| 172 |
-
"\n",
|
| 173 |
-
"history = model.fit(x_train, y_train,\n",
|
| 174 |
-
" batch_size=batch_size,\n",
|
| 175 |
-
" epochs=epochs,\n",
|
| 176 |
-
" validation_data=(x_test, y_test),\n",
|
| 177 |
-
" shuffle=True)\n",
|
| 178 |
-
"\n",
|
| 179 |
-
"model.save(\"mnist_LeNet.h5\")\n",
|
| 180 |
-
"\n",
|
| 181 |
-
"# Evaluate the performance of our trained model\n",
|
| 182 |
-
"scores = model.evaluate(x_test, y_test, verbose=1)\n",
|
| 183 |
-
"print('Test loss:', scores[0])\n",
|
| 184 |
-
"print('Test accuracy:', scores[1])"
|
| 185 |
-
]
|
| 186 |
-
}
|
| 187 |
-
],
|
| 188 |
-
"metadata": {
|
| 189 |
-
"kernelspec": {
|
| 190 |
-
"display_name": "Python 3",
|
| 191 |
-
"language": "python",
|
| 192 |
-
"name": "python3"
|
| 193 |
-
},
|
| 194 |
-
"language_info": {
|
| 195 |
-
"codemirror_mode": {
|
| 196 |
-
"name": "ipython",
|
| 197 |
-
"version": 3
|
| 198 |
-
},
|
| 199 |
-
"file_extension": ".py",
|
| 200 |
-
"mimetype": "text/x-python",
|
| 201 |
-
"name": "python",
|
| 202 |
-
"nbconvert_exporter": "python",
|
| 203 |
-
"pygments_lexer": "ipython3",
|
| 204 |
-
"version": "3.7.4"
|
| 205 |
-
}
|
| 206 |
-
},
|
| 207 |
-
"nbformat": 4,
|
| 208 |
-
"nbformat_minor": 2
|
| 209 |
-
}
|
|
|
|
| 1 |
+
version https://git-lfs.github.com/spec/v1
|
| 2 |
+
oid sha256:b84d69d00861f3c9f01714ccc40b68a765d5e319bdd51f1ef3df0d6a776c4dd1
|
| 3 |
+
size 7458
|
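
Note: one plausible reason for the weak 0.2432 test accuracy above is that tf.keras.optimizers.Adadelta() defaults to learning_rate=0.001 in TensorFlow 2.x, whereas the classic standalone-Keras default was 1.0; after a single epoch the model has barely moved. A sketch of the adjusted compile call, reusing the model from the removed cell:

    from tensorflow.keras.optimizers import Adadelta

    # Assumption: with the classic learning rate of 1.0, LeNet should converge
    # on MNIST within a few epochs instead of stalling near 24% after one.
    model.compile(loss='categorical_crossentropy',
                  optimizer=Adadelta(learning_rate=1.0),
                  metrics=['accuracy'])
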
13. Building LeNet and AlexNet in Keras/13.2 Build AlexNet and test on CIFAR10.ipynb
CHANGED
|
@@ -1,266 +1,3 @@
|
|
| 1 |
-
{
|
| 2 |
-
"cells": [
|
| 3 |
-
{
|
| 4 |
-
"cell_type": "markdown",
|
| 5 |
-
"metadata": {},
|
| 6 |
-
"source": [
|
| 7 |
-
"### Let's construct AlexNet in Keras!\n",
|
| 8 |
-
"\n",
|
| 9 |
-
"#### First let's load and prep our CIFAR10 data"
|
| 10 |
-
]
|
| 11 |
-
},
|
| 12 |
-
{
|
| 13 |
-
"cell_type": "code",
|
| 14 |
-
"execution_count": 2,
|
| 15 |
-
"metadata": {},
|
| 16 |
-
"outputs": [
|
| 17 |
-
{
|
| 18 |
-
"name": "stdout",
|
| 19 |
-
"output_type": "stream",
|
| 20 |
-
"text": [
|
| 21 |
-
"x_train shape: (50000, 32, 32, 3)\n",
|
| 22 |
-
"50000 train samples\n",
|
| 23 |
-
"10000 test samples\n"
|
| 24 |
-
]
|
| 25 |
-
}
|
| 26 |
-
],
|
| 27 |
-
"source": [
|
| 28 |
-
"from __future__ import print_function\n",
|
| 29 |
-
"import tensorflow as tf\n",
|
| 30 |
-
"from tensorflow.keras.datasets import cifar10\n",
|
| 31 |
-
"from tensorflow.keras.preprocessing.image import ImageDataGenerator\n",
|
| 32 |
-
"from tensorflow.keras.models import Sequential\n",
|
| 33 |
-
"from tensorflow.keras.layers import Dense, Dropout, Activation, Flatten\n",
|
| 34 |
-
"from tensorflow.keras.layers import Conv2D, MaxPooling2D, ZeroPadding2D\n",
|
| 35 |
-
"from tensorflow.keras.layers import BatchNormalization\n",
|
| 36 |
-
"from tensorflow.keras.regularizers import l2\n",
|
| 37 |
-
"from tensorflow.keras.utils import to_categorical\n",
|
| 38 |
-
"\n",
|
| 39 |
-
"# Loads the CIFAR dataset\n",
|
| 40 |
-
"(x_train, y_train), (x_test, y_test) = cifar10.load_data()\n",
|
| 41 |
-
"\n",
|
| 42 |
-
"# Display our data shape/dimensions\n",
|
| 43 |
-
"print('x_train shape:', x_train.shape)\n",
|
| 44 |
-
"print(x_train.shape[0], 'train samples')\n",
|
| 45 |
-
"print(x_test.shape[0], 'test samples')\n",
|
| 46 |
-
"\n",
|
| 47 |
-
"# Now we one hot encode outputs\n",
|
| 48 |
-
"num_classes = 10\n",
|
| 49 |
-
"y_train = to_categorical(y_train)\n",
|
| 50 |
-
"y_test = to_categorical(y_test)"
|
| 51 |
-
]
|
| 52 |
-
},
|
| 53 |
-
{
|
| 54 |
-
"cell_type": "markdown",
|
| 55 |
-
"metadata": {},
|
| 56 |
-
"source": [
|
| 57 |
-
"### Now let's create our layers to replicate AlexNet"
|
| 58 |
-
]
|
| 59 |
-
},
|
| 60 |
-
{
|
| 61 |
-
"cell_type": "code",
|
| 62 |
-
"execution_count": 3,
|
| 63 |
-
"metadata": {},
|
| 64 |
-
"outputs": [
|
| 65 |
-
{
|
| 66 |
-
"name": "stdout",
|
| 67 |
-
"output_type": "stream",
|
| 68 |
-
"text": [
|
| 69 |
-
"Model: \"sequential\"\n",
|
| 70 |
-
"_________________________________________________________________\n",
|
| 71 |
-
"Layer (type) Output Shape Param # \n",
|
| 72 |
-
"=================================================================\n",
|
| 73 |
-
"conv2d (Conv2D) (None, 32, 32, 96) 34944 \n",
|
| 74 |
-
"_________________________________________________________________\n",
|
| 75 |
-
"batch_normalization (BatchNo (None, 32, 32, 96) 384 \n",
|
| 76 |
-
"_________________________________________________________________\n",
|
| 77 |
-
"activation (Activation) (None, 32, 32, 96) 0 \n",
|
| 78 |
-
"_________________________________________________________________\n",
|
| 79 |
-
"max_pooling2d (MaxPooling2D) (None, 16, 16, 96) 0 \n",
|
| 80 |
-
"_________________________________________________________________\n",
|
| 81 |
-
"conv2d_1 (Conv2D) (None, 16, 16, 256) 614656 \n",
|
| 82 |
-
"_________________________________________________________________\n",
|
| 83 |
-
"batch_normalization_1 (Batch (None, 16, 16, 256) 1024 \n",
|
| 84 |
-
"_________________________________________________________________\n",
|
| 85 |
-
"activation_1 (Activation) (None, 16, 16, 256) 0 \n",
|
| 86 |
-
"_________________________________________________________________\n",
|
| 87 |
-
"max_pooling2d_1 (MaxPooling2 (None, 8, 8, 256) 0 \n",
|
| 88 |
-
"_________________________________________________________________\n",
|
| 89 |
-
"zero_padding2d (ZeroPadding2 (None, 10, 10, 256) 0 \n",
|
| 90 |
-
"_________________________________________________________________\n",
|
| 91 |
-
"conv2d_2 (Conv2D) (None, 10, 10, 512) 1180160 \n",
|
| 92 |
-
"_________________________________________________________________\n",
|
| 93 |
-
"batch_normalization_2 (Batch (None, 10, 10, 512) 2048 \n",
|
| 94 |
-
"_________________________________________________________________\n",
|
| 95 |
-
"activation_2 (Activation) (None, 10, 10, 512) 0 \n",
|
| 96 |
-
"_________________________________________________________________\n",
|
| 97 |
-
"max_pooling2d_2 (MaxPooling2 (None, 5, 5, 512) 0 \n",
|
| 98 |
-
"_________________________________________________________________\n",
|
| 99 |
-
"zero_padding2d_1 (ZeroPaddin (None, 7, 7, 512) 0 \n",
|
| 100 |
-
"_________________________________________________________________\n",
|
| 101 |
-
"conv2d_3 (Conv2D) (None, 7, 7, 1024) 4719616 \n",
|
| 102 |
-
"_________________________________________________________________\n",
|
| 103 |
-
"batch_normalization_3 (Batch (None, 7, 7, 1024) 4096 \n",
|
| 104 |
-
"_________________________________________________________________\n",
|
| 105 |
-
"activation_3 (Activation) (None, 7, 7, 1024) 0 \n",
|
| 106 |
-
"_________________________________________________________________\n",
|
| 107 |
-
"zero_padding2d_2 (ZeroPaddin (None, 9, 9, 1024) 0 \n",
|
| 108 |
-
"_________________________________________________________________\n",
|
| 109 |
-
"conv2d_4 (Conv2D) (None, 9, 9, 1024) 9438208 \n",
|
| 110 |
-
"_________________________________________________________________\n",
|
| 111 |
-
"batch_normalization_4 (Batch (None, 9, 9, 1024) 4096 \n",
|
| 112 |
-
"_________________________________________________________________\n",
|
| 113 |
-
"activation_4 (Activation) (None, 9, 9, 1024) 0 \n",
|
| 114 |
-
"_________________________________________________________________\n",
|
| 115 |
-
"max_pooling2d_3 (MaxPooling2 (None, 4, 4, 1024) 0 \n",
|
| 116 |
-
"_________________________________________________________________\n",
|
| 117 |
-
"flatten (Flatten) (None, 16384) 0 \n",
|
| 118 |
-
"_________________________________________________________________\n",
|
| 119 |
-
"dense (Dense) (None, 3072) 50334720 \n",
|
| 120 |
-
"_________________________________________________________________\n",
|
| 121 |
-
"batch_normalization_5 (Batch (None, 3072) 12288 \n",
|
| 122 |
-
"_________________________________________________________________\n",
|
| 123 |
-
"activation_5 (Activation) (None, 3072) 0 \n",
|
| 124 |
-
"_________________________________________________________________\n",
|
| 125 |
-
"dropout (Dropout) (None, 3072) 0 \n",
|
| 126 |
-
"_________________________________________________________________\n",
|
| 127 |
-
"dense_1 (Dense) (None, 4096) 12587008 \n",
|
| 128 |
-
"_________________________________________________________________\n",
|
| 129 |
-
"batch_normalization_6 (Batch (None, 4096) 16384 \n",
|
| 130 |
-
"_________________________________________________________________\n",
|
| 131 |
-
"activation_6 (Activation) (None, 4096) 0 \n",
|
| 132 |
-
"_________________________________________________________________\n",
|
| 133 |
-
"dropout_1 (Dropout) (None, 4096) 0 \n",
|
| 134 |
-
"_________________________________________________________________\n",
|
| 135 |
-
"dense_2 (Dense) (None, 10) 40970 \n",
|
| 136 |
-
"_________________________________________________________________\n",
|
| 137 |
-
"batch_normalization_7 (Batch (None, 10) 40 \n",
|
| 138 |
-
"_________________________________________________________________\n",
|
| 139 |
-
"activation_7 (Activation) (None, 10) 0 \n",
|
| 140 |
-
"=================================================================\n",
|
| 141 |
-
"Total params: 78,990,642\n",
|
| 142 |
-
"Trainable params: 78,970,462\n",
|
| 143 |
-
"Non-trainable params: 20,180\n",
|
| 144 |
-
"_________________________________________________________________\n",
|
| 145 |
-
"None\n"
|
| 146 |
-
]
|
| 147 |
-
}
|
| 148 |
-
],
|
| 149 |
-
"source": [
|
| 150 |
-
"l2_reg = 0\n",
|
| 151 |
-
"\n",
|
| 152 |
-
"# Initialize model\n",
|
| 153 |
-
"model = Sequential()\n",
|
| 154 |
-
"\n",
|
| 155 |
-
"# 1st Conv Layer \n",
|
| 156 |
-
"model.add(Conv2D(96, (11, 11), input_shape=x_train.shape[1:],\n",
|
| 157 |
-
" padding='same', kernel_regularizer=l2(l2_reg)))\n",
|
| 158 |
-
"model.add(BatchNormalization())\n",
|
| 159 |
-
"model.add(Activation('relu'))\n",
|
| 160 |
-
"model.add(MaxPooling2D(pool_size=(2, 2)))\n",
|
| 161 |
-
"\n",
|
| 162 |
-
"# 2nd Conv Layer \n",
|
| 163 |
-
"model.add(Conv2D(256, (5, 5), padding='same'))\n",
|
| 164 |
-
"model.add(BatchNormalization())\n",
|
| 165 |
-
"model.add(Activation('relu'))\n",
|
| 166 |
-
"model.add(MaxPooling2D(pool_size=(2, 2)))\n",
|
| 167 |
-
"\n",
|
| 168 |
-
"# 3rd Conv Layer \n",
|
| 169 |
-
"model.add(ZeroPadding2D((1, 1)))\n",
|
| 170 |
-
"model.add(Conv2D(512, (3, 3), padding='same'))\n",
|
| 171 |
-
"model.add(BatchNormalization())\n",
|
| 172 |
-
"model.add(Activation('relu'))\n",
|
| 173 |
-
"model.add(MaxPooling2D(pool_size=(2, 2)))\n",
|
| 174 |
-
"\n",
|
| 175 |
-
"# 4th Conv Layer \n",
|
| 176 |
-
"model.add(ZeroPadding2D((1, 1)))\n",
|
| 177 |
-
"model.add(Conv2D(1024, (3, 3), padding='same'))\n",
|
| 178 |
-
"model.add(BatchNormalization())\n",
|
| 179 |
-
"model.add(Activation('relu'))\n",
|
| 180 |
-
"\n",
|
| 181 |
-
"# 5th Conv Layer \n",
|
| 182 |
-
"model.add(ZeroPadding2D((1, 1)))\n",
|
| 183 |
-
"model.add(Conv2D(1024, (3, 3), padding='same'))\n",
|
| 184 |
-
"model.add(BatchNormalization())\n",
|
| 185 |
-
"model.add(Activation('relu'))\n",
|
| 186 |
-
"model.add(MaxPooling2D(pool_size=(2, 2)))\n",
|
| 187 |
-
"\n",
|
| 188 |
-
"# 1st FC Layer\n",
|
| 189 |
-
"model.add(Flatten())\n",
|
| 190 |
-
"model.add(Dense(3072))\n",
|
| 191 |
-
"model.add(BatchNormalization())\n",
|
| 192 |
-
"model.add(Activation('relu'))\n",
|
| 193 |
-
"model.add(Dropout(0.5))\n",
|
| 194 |
-
"\n",
|
| 195 |
-
"# 2nd FC Layer\n",
|
| 196 |
-
"model.add(Dense(4096))\n",
|
| 197 |
-
"model.add(BatchNormalization())\n",
|
| 198 |
-
"model.add(Activation('relu'))\n",
|
| 199 |
-
"model.add(Dropout(0.5))\n",
|
| 200 |
-
"\n",
|
| 201 |
-
"# 3rd FC Layer\n",
|
| 202 |
-
"model.add(Dense(num_classes))\n",
|
| 203 |
-
"model.add(BatchNormalization())\n",
|
| 204 |
-
"model.add(Activation('softmax'))\n",
|
| 205 |
-
"\n",
|
| 206 |
-
"print(model.summary())\n",
|
| 207 |
-
"\n",
|
| 208 |
-
"model.compile(loss = 'categorical_crossentropy',\n",
|
| 209 |
-
" optimizer = tf.keras.optimizers.Adadelta(),\n",
|
| 210 |
-
" metrics = ['accuracy'])\n"
|
| 211 |
-
]
|
| 212 |
-
},
|
| 213 |
-
{
|
| 214 |
-
"cell_type": "markdown",
|
| 215 |
-
"metadata": {},
|
| 216 |
-
"source": [
|
| 217 |
-
"### Now let us train AlexNet on our CIFAR10 Dataset"
|
| 218 |
-
]
|
| 219 |
-
},
|
| 220 |
-
{
|
| 221 |
-
"cell_type": "code",
|
| 222 |
-
"execution_count": null,
|
| 223 |
-
"metadata": {},
|
| 224 |
-
"outputs": [],
|
| 225 |
-
"source": [
|
| 226 |
-
"# Training Parameters\n",
|
| 227 |
-
"batch_size = 32\n",
|
| 228 |
-
"epochs = 1\n",
|
| 229 |
-
"\n",
|
| 230 |
-
"history = model.fit(x_train, y_train,\n",
|
| 231 |
-
" batch_size=batch_size,\n",
|
| 232 |
-
" epochs=epochs,\n",
|
| 233 |
-
" validation_data=(x_test, y_test),\n",
|
| 234 |
-
" shuffle=True)\n",
|
| 235 |
-
"\n",
|
| 236 |
-
"model.save(\"CIFAR10_AlexNet_1_Epoch.h5\")\n",
|
| 237 |
-
"\n",
|
| 238 |
-
"# Evaluate the performance of our trained model\n",
|
| 239 |
-
"scores = model.evaluate(x_test, y_test, verbose=1)\n",
|
| 240 |
-
"print('Test loss:', scores[0])\n",
|
| 241 |
-
"print('Test accuracy:', scores[1])"
|
| 242 |
-
]
|
| 243 |
-
}
|
| 244 |
-
],
|
| 245 |
-
"metadata": {
|
| 246 |
-
"kernelspec": {
|
| 247 |
-
"display_name": "Python 3",
|
| 248 |
-
"language": "python",
|
| 249 |
-
"name": "python3"
|
| 250 |
-
},
|
| 251 |
-
"language_info": {
|
| 252 |
-
"codemirror_mode": {
|
| 253 |
-
"name": "ipython",
|
| 254 |
-
"version": 3
|
| 255 |
-
},
|
| 256 |
-
"file_extension": ".py",
|
| 257 |
-
"mimetype": "text/x-python",
|
| 258 |
-
"name": "python",
|
| 259 |
-
"nbconvert_exporter": "python",
|
| 260 |
-
"pygments_lexer": "ipython3",
|
| 261 |
-
"version": "3.7.4"
|
| 262 |
-
}
|
| 263 |
-
},
|
| 264 |
-
"nbformat": 4,
|
| 265 |
-
"nbformat_minor": 2
|
| 266 |
-
}
|
|
|
|
| 1 |
+
version https://git-lfs.github.com/spec/v1
|
| 2 |
+
oid sha256:337403d0cc0c081c71fa23577f81d544101f9a61090bd40c32dfece66af5aa71
|
| 3 |
+
size 11012
|
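The notebook above (13.2) is now stored as a Git LFS pointer, so only the removed source survives in this diff. For quick reference, below is a minimal runnable sketch of the Conv-BatchNorm-ReLU-MaxPool block pattern that source repeats; the filter counts mirror the removed code, while the (32, 32, 3) input shape is an assumption based on CIFAR10 and the FC head is simplified (no BatchNorm) relative to the original.

# A minimal sketch, not the notebook's exact model: the (32, 32, 3) input
# shape is assumed (CIFAR10), and the FC head omits the original's BatchNorm.
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import (Conv2D, BatchNormalization, Activation,
                                     MaxPooling2D, Flatten, Dense, Dropout)

def conv_block(model, filters, kernel, pool=True):
    # One AlexNet-style block: convolution, batch norm, ReLU, optional 2x2 pool
    model.add(Conv2D(filters, kernel, padding='same'))
    model.add(BatchNormalization())
    model.add(Activation('relu'))
    if pool:
        model.add(MaxPooling2D(pool_size=(2, 2)))

model = Sequential()
model.add(Conv2D(96, (11, 11), padding='same', input_shape=(32, 32, 3)))
model.add(BatchNormalization())
model.add(Activation('relu'))
model.add(MaxPooling2D(pool_size=(2, 2)))
conv_block(model, 256, (5, 5))
conv_block(model, 512, (3, 3))
conv_block(model, 1024, (3, 3), pool=False)
conv_block(model, 1024, (3, 3))

model.add(Flatten())
model.add(Dense(3072, activation='relu'))
model.add(Dropout(0.5))
model.add(Dense(4096, activation='relu'))
model.add(Dropout(0.5))
model.add(Dense(10, activation='softmax'))
model.compile(loss='categorical_crossentropy', optimizer='adadelta',
              metrics=['accuracy'])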
13. Building LeNet and AlexNet in Keras/13.4 Fashion MNIST.ipynb
CHANGED
|
@@ -1,445 +1,3 @@
|
|
| 1 |
-
|
| 2 |
-
|
| 3 |
-
|
| 4 |
-
"cell_type": "markdown",
|
| 5 |
-
"metadata": {},
|
| 6 |
-
"source": [
|
| 7 |
-
"### Fashion MNIST"
|
| 8 |
-
]
|
| 9 |
-
},
|
| 10 |
-
{
|
| 11 |
-
"cell_type": "markdown",
|
| 12 |
-
"metadata": {},
|
| 13 |
-
"source": [
|
| 14 |
-
"### After downloading our dataset we see it's coded in the ubyte form\n",
|
| 15 |
-
"- We then use the following function to read the data and return it as a numpy array"
|
| 16 |
-
]
|
| 17 |
-
},
|
| 18 |
-
{
|
| 19 |
-
"cell_type": "code",
|
| 20 |
-
"execution_count": 1,
|
| 21 |
-
"metadata": {},
|
| 22 |
-
"outputs": [],
|
| 23 |
-
"source": [
|
| 24 |
-
"import struct\n",
|
| 25 |
-
"import numpy as np\n",
|
| 26 |
-
"\n",
|
| 27 |
-
"def read_idx(filename):\n",
|
| 28 |
-
" \"\"\"Credit: https://gist.github.com/tylerneylon\"\"\"\n",
|
| 29 |
-
" with open(filename, 'rb') as f:\n",
|
| 30 |
-
" zero, data_type, dims = struct.unpack('>HBB', f.read(4))\n",
|
| 31 |
-
" shape = tuple(struct.unpack('>I', f.read(4))[0] for d in range(dims))\n",
|
| 32 |
-
" return np.frombuffer(f.read(), dtype=np.uint8).reshape(shape)"
|
| 33 |
-
]
|
| 34 |
-
},
|
| 35 |
-
{
|
| 36 |
-
"cell_type": "markdown",
|
| 37 |
-
"metadata": {},
|
| 38 |
-
"source": [
|
| 39 |
-
"### We use the function to extact our training and test datasets"
|
| 40 |
-
]
|
| 41 |
-
},
|
| 42 |
-
{
|
| 43 |
-
"cell_type": "code",
|
| 44 |
-
"execution_count": 2,
|
| 45 |
-
"metadata": {},
|
| 46 |
-
"outputs": [],
|
| 47 |
-
"source": [
|
| 48 |
-
"x_train = read_idx(\"./fashion_mnist/train-images-idx3-ubyte\")\n",
|
| 49 |
-
"y_train = read_idx(\"./fashion_mnist/train-labels-idx1-ubyte\")\n",
|
| 50 |
-
"x_test = read_idx(\"./fashion_mnist/t10k-images-idx3-ubyte\")\n",
|
| 51 |
-
"y_test = read_idx(\"./fashion_mnist/t10k-labels-idx1-ubyte\")"
|
| 52 |
-
]
|
| 53 |
-
},
|
| 54 |
-
{
|
| 55 |
-
"cell_type": "markdown",
|
| 56 |
-
"metadata": {},
|
| 57 |
-
"source": [
|
| 58 |
-
"### Let's inspect our dataset"
|
| 59 |
-
]
|
| 60 |
-
},
|
| 61 |
-
{
|
| 62 |
-
"cell_type": "code",
|
| 63 |
-
"execution_count": 3,
|
| 64 |
-
"metadata": {},
|
| 65 |
-
"outputs": [
|
| 66 |
-
{
|
| 67 |
-
"name": "stdout",
|
| 68 |
-
"output_type": "stream",
|
| 69 |
-
"text": [
|
| 70 |
-
"Initial shape or dimensions of x_train (60000, 28, 28)\n",
|
| 71 |
-
"Number of samples in our training data: 60000\n",
|
| 72 |
-
"Number of labels in our training data: 60000\n",
|
| 73 |
-
"Number of samples in our test data: 10000\n",
|
| 74 |
-
"Number of labels in our test data: 10000\n",
|
| 75 |
-
"\n",
|
| 76 |
-
"Dimensions of x_train:(28, 28)\n",
|
| 77 |
-
"Labels in x_train:(60000,)\n",
|
| 78 |
-
"\n",
|
| 79 |
-
"Dimensions of x_test:(28, 28)\n",
|
| 80 |
-
"Labels in y_test:(10000,)\n"
|
| 81 |
-
]
|
| 82 |
-
}
|
| 83 |
-
],
|
| 84 |
-
"source": [
|
| 85 |
-
"# printing the number of samples in x_train, x_test, y_train, y_test\n",
|
| 86 |
-
"print(\"Initial shape or dimensions of x_train\", str(x_train.shape))\n",
|
| 87 |
-
"\n",
|
| 88 |
-
"print (\"Number of samples in our training data: \" + str(len(x_train)))\n",
|
| 89 |
-
"print (\"Number of labels in our training data: \" + str(len(y_train)))\n",
|
| 90 |
-
"print (\"Number of samples in our test data: \" + str(len(x_test)))\n",
|
| 91 |
-
"print (\"Number of labels in our test data: \" + str(len(y_test)))\n",
|
| 92 |
-
"print()\n",
|
| 93 |
-
"print (\"Dimensions of x_train:\" + str(x_train[0].shape))\n",
|
| 94 |
-
"print (\"Labels in x_train:\" + str(y_train.shape))\n",
|
| 95 |
-
"print()\n",
|
| 96 |
-
"print (\"Dimensions of x_test:\" + str(x_test[0].shape))\n",
|
| 97 |
-
"print (\"Labels in y_test:\" + str(y_test.shape))"
|
| 98 |
-
]
|
| 99 |
-
},
|
| 100 |
-
{
|
| 101 |
-
"cell_type": "markdown",
|
| 102 |
-
"metadata": {},
|
| 103 |
-
"source": [
|
| 104 |
-
"### Let's view some sample images"
|
| 105 |
-
]
|
| 106 |
-
},
|
| 107 |
-
{
|
| 108 |
-
"cell_type": "code",
|
| 109 |
-
"execution_count": 5,
|
| 110 |
-
"metadata": {},
|
| 111 |
-
"outputs": [
|
| 112 |
-
{
|
| 113 |
-
"data": {
|
| 114 |
-
"image/png": "iVBORw0KGgoAAAANSUhEUgAAAU4AAACuCAYAAABZYORfAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADh0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uMy4xLjEsIGh0dHA6Ly9tYXRwbG90bGliLm9yZy8QZhcZAAAgAElEQVR4nO2de7CVVf3/XwtTTCEFEUQEASUVrxiJ4j0HhW/e706Dfk2zKS0rHTOdkV+3+aUzlY6/asIySQlNM6HyRgqpgyFKeEHkGuIxFFRUtJLM9ftj7/ez1t7nOefs5+zL2c8+n9fMmbPPs9bezzr7s/fzfG7r83HeewzDMIzK6dPTCzAMw8gbduE0DMPIiF04DcMwMmIXTsMwjIzYhdMwDCMjduE0DMPISFUXTufcZOfccufcKufc1bValNGzmFxbF5NtbXDdzeN0zm0FrAAmAW3AIuA87/2LtVue0WhMrq2LybZ2fKyK5x4CrPLerwFwzt0JnAJ0KATnXI9k22+99dYA7LDDDgD885//TMbixw3gDe/9zo08YTfIjVybiDzIFTLKtlZyHTFiRPL4vffeA2CbbbYB4KOPPgJgy5YtyZw+ffqU/I7Httpqq5LfmtO3b99kjl579erV1S69Q7lWc+EcBrwS/d0GTCif5Jy7BLikivNkQm/of//73+TYkCFDAJgyZQoAzz77bDL21FNPAUEAEmSdeLmeL14jmlKuTU4e5AoVyLYecv3Wt76VPF6wYAEAu+66KxAuimvXrk3m9OvXDwgXw1dffTUZ+8QnPgFA//79S36PHj06maPXPuOMM6pdeodyrebC6VKOtbtDee+nA9OhvpqJc4Xl6IK5yy67JGOHH344ALfccgsA3/72t5OxDRs2AEFwDbqANjNNJVejpnQp23rIddSoUcnj4447DgiWnrTRDz/8MJmj794HH3wAwLx585IxXUQHDhwIwODBg9udb86cObVYdqdUExxqA4ZHf+8G/KO65RhNgMm1dTHZ1ohqLpyLgDHOuVHOuW2Ac4H6X+qNemNybV1MtjWi26a69/5D59xlwEPAVsCt3vulNVtZ9vWU/H3uuecmj2+88caSsWnTpiWPr7rqKgBuuOEGoFeb6EDzydWoHY2W7RFHHAHAdtttlxxra2sD4P333y/5HQdyFNxZuXIlAB//+MeTsRdfLMSxXn654H48/fTTAdh+++2TORMnTgTg6aefBmDp0tr/i9X4OPHe3w/cX6O1GE2CybV1MdnWhqounM1AeRT92GOPBUoj50J3Jd3lINzVDjnkECBE2T/2sfDWxI5rwzAqY/ny5QC89tprybEJEwpB/N122w0IgZ84ZUmR9sMOO6xkLoTv8P33F679jz/+OFD6fV+xYgUA//hH/dy3tuXSMAwjI7nUOJV6BKX5mhDSkGbNmtXuef/617/aHfv9738PwJe+9CUgaJymZRpGdSif8t///ndyTN9L+ShPO+00AJYsWdLu+crjjP2fsgQPOOAAAPbff38AfvrTnyZz3nzzzZK59cA0TsMwjIzkQuOUhqnIuZLUIWicJ554IgDPPfdcu+eX+0Hj7VlKspWGqUjgE088kczRncu0UMOonLFjxwJhAwrAxRdfDMDChQsBuP766wF4/fXXkzn6nv/nP/8BSjNdBg0aBMBFF10EwPnnnw+EbdUAO+20ExAS7zdu3FiT/yfGNE7DMIyM2IXTMAwjI7kw1aW6l+9Hh5Acqzlpya7lASSZADEPPfQQEIJEsale/nzDMLpGKUZvvfVWcizedw4hcBR/x8orJ8UoAf7tt98uOT5jxozk8ZlnngnAu+++2+21d4VpnIZhGBnJhcYpyoNEAGeddRYAjz76aOrc8vlQeifT3W3dunVAcFKPHz8+maOtWxYkai60lU+Vdi699NJk7KijjgJg/vz5APzsZz+ryTnjwGRv357bETvvXChhOXLkSCC9LqZq4ypFUN9DCBtU9P7GWy7FG2+8UfL3zJkzk8cXXnghAFOnTgXg2muvzf5PdIFpnIZhGBnJhcZZXiNz7733TsY2b94MhOIBmltpS5By7fHee+8F4LOf/WxyTBqnaRjNhVLJRLztTtqOCj6ccMIJQGnBCaWpKG1l06ZNydiPf/xjAP785z+XPK/BHQNyiVL6pE0OGDCg3RzJRzKMq7wLWXhxTEKFjOOiHgDbbrtt8ljf0x133BGAo48+Ohn7y1/+kuE/6RjTOA3DMDJiF07DMIyMdGmqO+duBU4ENnjv9yseGwjcBYwE1gJne+83dfQa1VJuIk+ePDl5XF5rM6s5rfky8RUkik2BPffcE4BVq1Zleu1mphnkWi3laWJxCln8GODUU08FYMyYMckxyVOVsYYNG5aMKcAgUz0tha1Z6WnZqv6DfscBNaEqSe+88w6QnvKn3UDa1w7w0ksvAaHXkIh3B6mqUj2pROO8DZhcduxq4BHv/RjgkeLfRr64DZNrq3IbJtu6UlFfdefcSOCP0d1rOXCM9369c24oMN97v1cFr1NV8yc5nWNHsqoZ1Ro1gwI455xzgJqktDzjvR/f9bTG0CxyrRWqSRA/Tgs6VILquip5W4GneE90pIU2lVyhNrKtp1xnz54NhOBQHOhTcrsCPnFdzeeffx5oXwVNFZFqTIdy7W5UfYj3fj1AURDtW80VsTayucLk2rpUJFuTa2XUPR2plu1Ghw8vNOhLq7VZa+JtYq+8UmhFLS00Huut1FKu2qwQb1qQX0y+r0oso9hPVr5NN63tc/k5YuTjlPbz1a9+FWj9zQ+1lGtn77lqdS5btqzd88q3XMbxBvmhlUAvX2escTaixXd3o+qvF9V9ir831G5JRg9icm1dTLY1pLsa5xzgAuAHxd+za7aiFHSXue+++wCYMmVKMnbJJQWrYu3atQBs2FD4PMRRUPmllCwf+1OUUCukfcTHtQVMYw888EA1/04zUze5Kpk51hylDUqLi8fKtYW07badUf78NK2y/Ng999yTPD7ooIOA9j70tFqwOaGh31lI1/jKt2FKrrHvWM/T+5umcQ4dOhSA3XffHQjf/47OW2u61Didc7OAJ4G9nHNtzrmLKLz5k5xzK4FJxb+NHGFybV1MtvWnS43Te39eB0PH1XgtRgMxubYuJtv6k4u96pdddhkAo0ePBoI5DkFFV+BGqnuchiJzQOZivN9YZrtMMKU5xOa85itpt4VN9bpRSVAlrkGgdCClgMlEj1OOZJJVar53xM033wzAgQcemBxTUOimm26q6rV7M2lBGgVz1L4mLV1MJrrG4u+iXlNjcevgRmJbLg3DMDKSC41TgRo1f9I2LQjN7nVMFVGUrgAhvSGtso3uZnJOK9k21pB0l1SQyEinsxqoeg/jSjlq4iW5xg27rrjiCiAE9O644452r1utpqkgo7bwxut/+OGHgdBUTOQsINSjdKZxyvrTWKx5xlYFlAaOhD4Xgwd3mGpcV0zjNAzDyEguNE4lvKuydIx8JdIWpHnGdzBVkNbzY62hfP57770HBF8nh
BSIH/7wh9X+Ky1NrBno/bzgggsAuPrqwtboG264IZmjwgxq+bpmzZpkbP369QB88YtfBILGWctUk6uuugoI1kVcKOKaa64pmduIpOreQHlRHX0X0wqBSCuN0TZM9SqKt0Y3EtM4DcMwMmIXTsMwjIzkwlTfd999gdCgKQ4KaN+4nMQy3dMCB6q1GQd+5IjWsbRy/WqtUKcKLC1Bnz59SoIrYsiQIUB4X1euXJmMHX/88UCQQVxHccWKFUBoxtfdmqh6bZmEcUM3uQgk+7TqV+WtWKxZW+WkvT9xYDcmrU6BXD+xa03HZKr3FKZxGoZhZCQXGucBBxwAhMCN9qkC7LTTTkDQEHV3ilMaNKZUpTihVqlKusvpjhg39ZKmWs8G93nno48+atc8DWD69OkAfOUrXwFKtX2lHymJ+fHHH0/Gxo0bBwQN48QTTwRKK/5rrLPq7OXpQ6p6BOFzoNSWtKpb5VpTtSlQvR2lD44YMQIIVlxaMLezTROyYOJanY3ENE7DMIyM5ELjlF/q5z//OVCqYejuJN+mSKvNqN/SMiHc6XRMc9KSbvPUd6aR9O/fn/Hjx5dsWVy8eDEAjz32GBBqmqr9MrS3IA4++OBkTHKQhrJ8+fJ2580ij5NOOgkIn5eYs88+u+LXMapDGqZkX157E4L1F39PheZJjtoa22hM4zQMw8hILjROoQrwcZEPNb2XT1N3qzhK15mfqnzLV3mjewi+lvKtYEaBLVu2sG7duqSTJAQN74UXXgDC+xonNUujkLUQa6zKltD8733vewAcddRRyRwVdJE84xqqqjCuz4Pqa8YairSeRx99tN3/pO258nWr18348aEFzXXXXQfALbfc0u75RudZB/oOS/bxXFl7aUnx8kvr8yRLJqYpKsA754Y75+Y555Y555Y65y4vHh/onJvrnFtZ/D2gq9cymgeTa2ticm0MlZjqHwJXeO/3AQ4FLnXOjcXajeYdk2trYnJtAJUUMl4PqDveZufcMmAYcApwTHHaDGA+8M26rLKIzLY4TUF7V8v3tcZqukzsNFNbz1dKiqq3xGafTMq0gFFeqaVcP/jgA1avXs3ll1/ebmyPPfYA4Ne//jVQGsRTBSvJIJaPTGSZZJqjpHWAJUuWAEF2mzZtSsZkhsuVIxM9/pwoxUnriFNiFHjS82W633777cmcNBO/p2mm72tnKLVP9SPiVLa0PeqiPCleQchGk8nHWezVPA5YiLUbbRlMrq2JybV+VHzhdM71A34HfM17/27a9ro0atluVBVz0pLbhe5I8RxpOToW39G0dUsaitIc4tqd0nZaSeMU9ZarmnI1I+W1NluJZvi+doY2PyjgG6eJ6fsqrTK2HvXdVQAorcZuI6goHck5tzUFIcz03isRz9qN5hyTa2ticq0/XWqcrnCr+iWwzHv/o2io4e1GlXYS+0PK28bK/5mWHC1fZVold/nU9NqxL06aqjTPVqCZ5GrUjmaUa1phFKURHXnkkUD69sq0tCIde/nll+uz2AqpxFQ/HJgKPO+cW1I8dg0FAfy22Hp0HXBWfZZo1AmTa2ticm0AlUTVnwA6cpBYu9GcYnJtTUyujSFXO4dkMiuQA8Eklxkt0z0ODimlRekmac7m8j3qcXUkjaXtKjIMIzvPPfccAOeffz6QXkNAbrMsdT07ml9rbK+6YRhGRnKhcUqLVHAnTifSnUpjSliOtcNyJ3N8d5PGWp7k3tbWlsxR47a0u6JhGNnR5gVZhvH3tTywq+90TNxKuicwjdMwDCMjudA4VbldmmeciC4/iOr8KSFWvW4APve5z5U8P+5Xcs899wAh2VZpEfE50qq0GIbRfbQ9Nq3mprRQpQ3Gc/TdTdNCG4ldEQzDMDKSC41TRRjko4wj3uXaqOowxsydOxcIhTviiLv8l9IwVQwirtuosX79+lX7rxiGEaHvdlrXSmml8Zh8oT0dbzCN0zAMIyN24TQMw8hILkz1tWvXAqGG38aNG5MxmeiDBxeqZKnFQZwEqwR2mfHxXneZ33JAp+1LVwtSrcMwjMrpLCH9j3/8IxD2rEOoEzFq1CgAnnrqqWRs0aJFAMycObPm68yCaZyGYRgZcXHjsrqfzLmNwPvAGw07ae0YRPXr3t17v3MtFtNMmFxNrk1IXeXa0AsngHPuae/9+K5nNhd5XXejyOv7k9d1N4q8vj/1XreZ6oZhGBmxC6dhGEZGeuLCOb0HzlkL8rruRpHX9yev624UeX1/6rruhvs4DcMw8o6Z6oZhGBmxC6dhGEZGGnbhdM5Nds4td86tcs5d3ajzZsU5N9w5N885t8w5t9Q5d3nx+EDn3Fzn3Mri7wE9vdZmIQ+yNblmx+TayXkb4eN0zm0FrAAmAW3AIuA87/2LdT95Roo9p4d67xc75/oDzwCnAv8LvOW9/0HxQzTAe//NHlxqU5AX2Zpcs2Fy7ZxGaZyHAKu892u891uAO4FTGnTuTHjv13vvFxcfbwaWAcMorHdGcdoMCsIxciJbk2tmTK6dUNWFM4MqPwx4Jfq7rXisqXHOjQTGAQuBId779VAQFjC451ZWXzKaaLmTbW+VK7T2d7aRcu32hbOoyv8EmAKMBc5zzo3taHrKsabOg3LO9QN+B3zNe/9uT6+nUWSUK+RMtr1VrtDa39lGy7XbPk7n3GHA//Hen1D8+1sA3vv/29Fc4Phur7QLyiu5x1XeVdZK/2vcQ0jH1GtIJeTqxBvNXgwii1yj+QvqtZ64oykEOVWKPg/lnRNrTNPLFbr1na2bXHNCh3Ktph5nmio/oXySc+4S4BJg/yrO1SUXXnghEGpuqk0GhKZPasgW19rUMbUbnTZtWj2X+XI9X7xGZJVrXRkwYIDOB8CGDRsqep5unKrT+uqrr3Y4V69dRaA0D3KFCmTbKLnmhA7lWs2FsyJV3ns/HZjunPsf4E9VnC8VaRRjxxYsDl0UpYFC6HypsfjCuXr16pLXUy+Tnu6i14NkkiuAc64mJlxsJUydOhWAk08+GQjFqx988MFkjiwJySruTKob4DvvvAPAo48+CsBdd92VzFmzZo3+l1osPw90Kdt6yLUVqSY41AYMj/7eDejQzvXe31/FuYzGkUmuRq4w2daIai6ci4AxzrlRzrltgHOBObVZltGDmFxbF5Ntjei2qe69/9A5dxnwELAVcKv3fmnNVlYhEyYUXDQ771zw4b722msAvPXWW8kc+bAUaNiyZUsyJtN86NChAIwePRqApUsb/q80BT0h16OPPhqAiy++ODmmFtBqDSsfdMynPvUpIJjjscyXL18OhKDQbrvtBsDdd9+dzLnvvvsA+O53v9vutWvg92w6muU72wpU1aytaH6bCd5imFxbF5NtbchFl8vO6N+/PwCDBg0qOS6NBULwQBpJHDiSxrnjjjsCobNeb9U464UCP8piABg4cCAAV155JQDr169PxtR9VPKVPN9+++1kjrTKcePGAaVa6bPPPltyvjFjxgAhIARw1llnAfCrX/0KgLa2tu7+e0Yvw6ojGYZhZCT3Gme5D0wpKnHCs3L51Ds9TqJWmov8
nuUJ10ZtSOutLX+y0oik/UOQleQorTL2T69cuRKAY445BoDtt98+Gdu8eXPJ8/faa6+S4wAvv1xI05P/0zROo1JM4zQMw8hI7tUraRDSHuQ3i5PctY1SPk75zQDWrVtXckzJ8kb9kV9Zsttnn32SMWmoH3zwAQDz588HSjXG008/HQi7guTzBJg8eTIAw4YV6lLIzx0/X0ib/etf/1rNv2P0IkzjNAzDyIhdOA3DMDKSe1NdgR4V8ujbty8AQ4YMaTdHgZ933w1Vp2S6yTSM02WM2pGWSD5x4kQAnnvuOSAEcCBsSHjyyScB+OQnPwmUpiOp4Idk/8YbbyRjCvqNHDkSCIGjl156KZmzYsUKoNSt09l6DUOYxmkYhpGR3Guc0gyUliRtMg4AKTCwadMmIGzPjInTXIzGILkceuihQKm2L1nJWpAGefzxoaTrnDmFbdYHHXQQADvttFMypkCg0tSUchSfQ9t107ZzqmZrWhqV0X0qeV9lNSowGDN+/HggbGLRFmulpEEIDCro+MILLyRjChbqHNdffz1QmopWyRpN4zQMw8hI7jVOJU3LfymNQr4tgBkzCj2btI1v//1DTWUlSCsJO+0uZ9QHaQLyTWoLJAQNUXf//fbbD4DZs2cnc6Qpylep7bIQZC3NQlZGvJVWRUXirZ7CNM36UP6+pml3nX0HJdfDDjsMCN9b+cQhxDdeeaVQszn2i0uz1Dm6W3fXNE7DMIyM2IXTMAwjI12a6s65W4ETgQ3e+/2KxwYCdwEjgbXA2d77TfVbZse89957QPumXDLRIASM1IdIO1YgmPhKSdHrtTrNINcDDjgACOZamtkkE/vvf/87ULqfXUEl7Vm/+eabkzG5alQ165lnngFKdw7ddNNNQNgX3yo0g2y7orMAzKWXXgqEFLRVq1YlY6ecUmjtLlNb39c777wzmaPvsj4zqn5VKZW4aSrROG8DJpcduxp4xHs/Bnik+LeRL27D5Nqq3IbJtq50qXF67x8rNnqPOQU4pvh4BjAf+GYN11UxSkMSSl+JNU6lpqhSeNzUS+lMqhcZO5JbmWaQqywAaZ6qlgQhHUnyVT3NtWvXJnOkdeh1tHcdQhBIY0pZmjVrVjJHmolSnKSB5p1mkG1HyMLT91RBwOuuuy6Z89BDDwHwpz8VejuqngSUWhUdoeCvUpcq4ZBDDkkey0KNN0uU092o+hDv/XoA7/1659zgjiZau9FcYXJtXSqSrcm1MuqejlTvdqNKZVECu3wecUqD7ljSLmMfp3yi+p2WmmK0p7ty3WGHHZLHe++9NwDPP/98u3naIimZLViwAAiaKARfmKq9xzIfMWIEEBKl9TmJU9FUNUufh3333TcZ660dAOr9fS33Hx555JEA/OY3v0mOqRJWGtJYZSHqexu/7j333FMylsaBBx4IwKc//WkAzjjjjGTsxhtvBDrXOLsbVX/dOTcUoPh7Qzdfx2guTK6ti8m2hnRX45wDXAD8oPh7dufT64d8koqWSvOMt1DKt5lWuEEJtUqc7+UJ8HWXq3xaEGqf7rnnngDsscceyZg0f1kLmnPHHXe0m6N6nEp4huDblK9bGok6Y0IoDiIUxYWW1DhrKtuKtiX26Vovk8b59a9/vaJzyDdaXrgnnqPtuY899hgAY8eOTcYUldfnQlbH/feH/nXysXZGl/+Zc24W8CSwl3OuzTl3EYU3f5JzbiUwqfi3kSNMrq2Lybb+VBJVP6+DoeNqvBajgZhcWxeTbf3J/V51obQVtZGNVXclVkv1T6u52VvSkHqak046KXksc1r1ONX2BEJKiII6kufVV4f0QwV1FPCJA08KOKmdhtJNVDEHQuK7KuzEFbWM6qkkkVzv+dSpU5Njv/jFLyp+vkz2XXfdNTmmIN+kSZOAEISE4LbTpgl9HvQZqBTbcmkYhpGRltE45ehXsCdOcnfOAeEOFifHS0O1NKTGoG2SELRBWQLSBiCkkihtRBZFnAy9ePFiIAQIVIcRYPXq1UCQuQKDb775ZjJn+PDhQNA8464BRoHyAI/ez4q2JVYQQFKDvBNOOCE5Jo2zkteWtXH22WcnY9I+ZWl++ctfTsZUHUnnO+200wCYNm1al+csOX+m2YZhGEbraJxKmBbSJCHcnaRVpmmc1mOmMSgxHYJ1IC1AGiCEBOcXX3wRCP7PuAWwtmiqLufMmTOTMaWnSZscMGAAUJqmdvvttwNBI4m1YaNANXVJK3nuH/7wBwBOPvnk5JjS0mQ1xKjoizRM+TGXLVuWzPnCF74ABP9nGkphU0vqrDEO0zgNwzAyYhdOwzCMjLSMqS6TWykqcXBIuwNE7PDW8+I6j0b9UEAHwi4t7SCKAzcyyeRWkaket0RRczYFjAYPDnUrFi5cCMCUKVOA4A6QiQZw9NFHA8HUj9sTG9mIXWWqDyDz+xvf+EYydvDBBwNhd49M5Hh3z3HHFdJNZarHLpQzzzwTCClHV1xxBRBcOjFKd9PnC4L5rnQ3fU6y7hg0jdMwDCMjLaNxKtAgzSSuolNOXAVcGmqcPG3UniOOOAIofZ+lKUq7lKYCoeq3EtllEcStfHfffXcgBAjiwI/krwCSnq/UNAifA2kvqhgOoX7nkiVLsv2jLUafPn1KLDQFfMqbG1522WXJHCWVq/J6vLHg+9//PgCXXFKoXCfNL65hoEZsDz74IACf//znkzEF+WRJCKWkxahSWmd75rtraZrGaRiGkZGW0TjlY5EGGWsm5cTJ7rob6k5m1AdVJZJlACGpXSlGw4YNS8a0TU6tf/W82JclLUN+T1X6h6AZqcq7/KexZiPNSNs6Y81IWk9v1zg/+uij1LSicp9g3Nr57rvvBkJK2Xe+851kLPZlxmguwMSJE4GgacZWSnweCD7weD1pFZPKkRYax0KyYBqnYRhGRlpG45SfTJHW2N9VTpwYq8ib6j0a9UH9fO67777k2Gc+8xkgRE3/9re/JWPqFyMt8sknnyx5DoREaVkLcXL8mjVrgODrVGQ11kKkqcq3qcrfEHxvvZltt92W0aNHJz2ZIFhykovqlko+EN5rae0q4gJw7LHHAiFyrsrv8ldD+Dzo+eecc067tck3mdYZtbzvWFrEXM+LMzmyUEk9zuHOuXnOuWXOuaXOucuLxwc65+Y651YWf5utmyNMrq2JybUxVGKqfwhc4b3fBzgUuNQ5NxZrN5p3TK6ticm1Abise7Sdc7OB/1f8OabYMW8oMN9732kGcT2aP4m5c+cCwWSPnc1xgy4IAQcIAQpV5lEp/zrxjPe+8p6lDaQZ5Bqbx5Kn2gGrrUU8R6lGCu7EJuWVV14JhNbDSnaPzfm4NmiVtKRc+/bt63fZZZeS90nBFLnEFJSN3V8Kzui9j81wmfQK0j399NNAaQNFVbk6//zzgdLmbTK/05q0ifKqTGnpVHLzPPzwwyV/l9GhXDP5OIu9mscBC7F2oy2DybU1qVausfJhlFLxhdM51w/4HfA17/27cSJxZ9S73ajQXUXpJ5059+O16w7WW6sjNVqucaKytBR9QV944YVkTPVVFbRTsG/UqFH
JHAUm9JpPPPFEMrbffvsBQTNSEEBaEIQUp7RE6WqqAjUDtZLrunXr+MlPfpLp3ArcqNpVnGYmOSooO2HCBKA0uKTkeBHLpZKtkeWyS5OltnNee+21Xb5eGhWlIznntqYghJne+3uLh63daM4xubYmJtf606XG6Qq3ql8Cy7z3P4qGmqZFMIQ7kXwusS+rnDgFQYUhpHX0FnpKrml3fyW7x1aC5Kf0I22lfemll5I50lqkcS5atCgZU6tgaZyyLOJtldpqqUIPlWplzUwzfF+V6qNYQhxTKOeBBx7o8vXqqf3feeed3XpeJab64cBU4HnnnLZRXENBAL8tth5dB5zVwfON5sTk2pqYXBtAJe2BnwA6uhVbu9GcYnJtTUyujaFldg4pJUWOaQUO0ojL7CtVors7CIxspAXhlBKmgBCEakqq3ymzPA7uqKWrgg9qgwAhzUVmvIKGcaBhzJgxQHwVIDwAAAO5SURBVDDVe2uA0MiO7VU3DMPISMtonHJIS6PoLB1JVXEgNPOyvcmNIU5HUhJznLgupH0qfUUJ07EloU0LqrjzyCOPJGM6psCEan1Kc4XQeviOO+4ATOM0Ksc0TsMwjIy0jMapGptKKZE2ksbKlSuTx9JQ46rwRv1IS/nRNlelDkHoE6WtcNp6Gfe2UWqR+gjFVoPSzPQ5kEUSz5GPU8T+T2mfeU+EN+qDaZyGYRgZaRmNUxqGigXERQPKkW8Ngs9t0KBBdVydIeL3Xlx88cVAaTEWVf1W3UbJN36+OhzK/xnLUAnw0iJVe/Ooo45K5sQV48G0S6NyTOM0DMPIiF04DcMwMtIyprpSSpR2ooZRacT72BV0ePbZZ+u3OCMhLeVHrRaUiB6z3XbbAXDyyScDsHHjxmRs3rx5AMyaNQuABQsWJGNKcdI++Ouvvx4IGyUqXZthpGEap2EYRkYyV4Cv6mTObQTeBzq+7Tcvg6h+3bt773euxWKaCZOrybUJqatcG3rhBHDOPd2sbQY6I6/rbhR5fX/yuu5Gkdf3p97rNlPdMAwjI3bhNAzDyEhPXDin98A5a0Fe190o8vr+5HXdjSKv709d191wH6dhGEbeMVPdMAwjI3bhNAzDyEjDLpzOucnOueXOuVXOuasbdd6sOOeGO+fmOeeWOeeWOucuLx4f6Jyb65xbWfw9oKfX2izkQbYm1+yYXDs5byN8nM65rYAVwCSgDVgEnOe9f7HuJ89Isef0UO/9Yudcf+AZ4FTgf4G3vPc/KH6IBnjvv9mDS20K8iJbk2s2TK6d0yiN8xBglfd+jfd+C3AncEqDzp0J7/167/3i4uPNwDJgGIX1zihOm0FBOEZOZGtyzYzJtRMadeEcBrwS/d1WPNbUOOdGAuOAhcAQ7/16KAgLGNxzK2sqcidbk2tFmFw7oVEXzrQ+z02dB+Wc6wf8Dvia9946uXVMrmRrcq0Yk2snNOrC2QYMj/7eDfhHg86dGefc1hSEMNN7f2/x8OtFf4r8Kht6an1NRm5ka3LNhMm1Exp14VwEjHHOjXLObQOcC8xp0Lkz4QrdxH4JLPPe/ygamgNcUHx8ATC70WtrUnIhW5NrZkyunZ23UTuHnHP/A9wIbAXc6r3/fkNOnBHn3BHA48DzgJrQXEPBb/JbYASwDjjLe/9W6ov0MvIgW5NrdkyunZzXtlwahmFkw3YOGYZhZMQunIZhGBmxC6dhGEZG7MJpGIaREbtwGoZhZMQunIZhGBmxC6dhGEZG/j/Y4FGAF4Wd6QAAAABJRU5ErkJggg==\n",
|
| 115 |
-
"text/plain": [
|
| 116 |
-
"<Figure size 432x288 with 6 Axes>"
|
| 117 |
-
]
|
| 118 |
-
},
|
| 119 |
-
"metadata": {
|
| 120 |
-
"needs_background": "light"
|
| 121 |
-
},
|
| 122 |
-
"output_type": "display_data"
|
| 123 |
-
}
|
| 124 |
-
],
|
| 125 |
-
"source": [
|
| 126 |
-
"# Let's do the same thing but using matplotlib to plot 6 images \n",
|
| 127 |
-
"import matplotlib.pyplot as plt\n",
|
| 128 |
-
"\n",
|
| 129 |
-
"# Plots 6 images, note subplot's arugments are nrows,ncols,index\n",
|
| 130 |
-
"# we set the color map to grey since our image dataset is grayscale\n",
|
| 131 |
-
"plt.subplot(331)\n",
|
| 132 |
-
"random_num = np.random.randint(0,len(x_train))\n",
|
| 133 |
-
"plt.imshow(x_train[random_num], cmap=plt.get_cmap('gray'))\n",
|
| 134 |
-
"\n",
|
| 135 |
-
"plt.subplot(332)\n",
|
| 136 |
-
"random_num = np.random.randint(0,len(x_train))\n",
|
| 137 |
-
"plt.imshow(x_train[random_num], cmap=plt.get_cmap('gray'))\n",
|
| 138 |
-
"\n",
|
| 139 |
-
"plt.subplot(333)\n",
|
| 140 |
-
"random_num = np.random.randint(0,len(x_train))\n",
|
| 141 |
-
"plt.imshow(x_train[random_num], cmap=plt.get_cmap('gray'))\n",
|
| 142 |
-
"\n",
|
| 143 |
-
"plt.subplot(334)\n",
|
| 144 |
-
"random_num = np.random.randint(0,len(x_train))\n",
|
| 145 |
-
"plt.imshow(x_train[random_num], cmap=plt.get_cmap('gray'))\n",
|
| 146 |
-
"\n",
|
| 147 |
-
"plt.subplot(335)\n",
|
| 148 |
-
"random_num = np.random.randint(0,len(x_train))\n",
|
| 149 |
-
"plt.imshow(x_train[random_num], cmap=plt.get_cmap('gray'))\n",
|
| 150 |
-
"\n",
|
| 151 |
-
"plt.subplot(336)\n",
|
| 152 |
-
"random_num = np.random.randint(0,len(x_train))\n",
|
| 153 |
-
"plt.imshow(x_train[random_num], cmap=plt.get_cmap('gray'))\n",
|
| 154 |
-
"\n",
|
| 155 |
-
"# Display out plots\n",
|
| 156 |
-
"plt.show()"
|
| 157 |
-
]
|
| 158 |
-
},
|
| 159 |
-
{
|
| 160 |
-
"cell_type": "markdown",
|
| 161 |
-
"metadata": {},
|
| 162 |
-
"source": [
|
| 163 |
-
"### Let's create our model"
|
| 164 |
-
]
|
| 165 |
-
},
|
| 166 |
-
{
|
| 167 |
-
"cell_type": "code",
|
| 168 |
-
"execution_count": 32,
|
| 169 |
-
"metadata": {},
|
| 170 |
-
"outputs": [
|
| 171 |
-
{
|
| 172 |
-
"data": {
|
| 173 |
-
"text/plain": [
|
| 174 |
-
"10"
|
| 175 |
-
]
|
| 176 |
-
},
|
| 177 |
-
"execution_count": 32,
|
| 178 |
-
"metadata": {},
|
| 179 |
-
"output_type": "execute_result"
|
| 180 |
-
}
|
| 181 |
-
],
|
| 182 |
-
"source": [
|
| 183 |
-
"num_classes"
|
| 184 |
-
]
|
| 185 |
-
},
|
| 186 |
-
{
|
| 187 |
-
"cell_type": "code",
|
| 188 |
-
"execution_count": 6,
|
| 189 |
-
"metadata": {},
|
| 190 |
-
"outputs": [
|
| 191 |
-
{
|
| 192 |
-
"name": "stdout",
|
| 193 |
-
"output_type": "stream",
|
| 194 |
-
"text": [
|
| 195 |
-
"x_train shape: (60000, 28, 28, 1)\n",
|
| 196 |
-
"60000 train samples\n",
|
| 197 |
-
"10000 test samples\n",
|
| 198 |
-
"Number of Classes: 10\n",
|
| 199 |
-
"Model: \"sequential\"\n",
|
| 200 |
-
"_________________________________________________________________\n",
|
| 201 |
-
"Layer (type) Output Shape Param # \n",
|
| 202 |
-
"=================================================================\n",
|
| 203 |
-
"conv2d (Conv2D) (None, 26, 26, 32) 320 \n",
|
| 204 |
-
"_________________________________________________________________\n",
|
| 205 |
-
"batch_normalization (BatchNo (None, 26, 26, 32) 128 \n",
|
| 206 |
-
"_________________________________________________________________\n",
|
| 207 |
-
"conv2d_1 (Conv2D) (None, 24, 24, 64) 18496 \n",
|
| 208 |
-
"_________________________________________________________________\n",
|
| 209 |
-
"batch_normalization_1 (Batch (None, 24, 24, 64) 256 \n",
|
| 210 |
-
"_________________________________________________________________\n",
|
| 211 |
-
"max_pooling2d (MaxPooling2D) (None, 12, 12, 64) 0 \n",
|
| 212 |
-
"_________________________________________________________________\n",
|
| 213 |
-
"dropout (Dropout) (None, 12, 12, 64) 0 \n",
|
| 214 |
-
"_________________________________________________________________\n",
|
| 215 |
-
"flatten (Flatten) (None, 9216) 0 \n",
|
| 216 |
-
"_________________________________________________________________\n",
|
| 217 |
-
"dense (Dense) (None, 128) 1179776 \n",
|
| 218 |
-
"_________________________________________________________________\n",
|
| 219 |
-
"batch_normalization_2 (Batch (None, 128) 512 \n",
|
| 220 |
-
"_________________________________________________________________\n",
|
| 221 |
-
"dropout_1 (Dropout) (None, 128) 0 \n",
|
| 222 |
-
"_________________________________________________________________\n",
|
| 223 |
-
"dense_1 (Dense) (None, 10) 1290 \n",
|
| 224 |
-
"_________________________________________________________________\n",
|
| 225 |
-
"activation (Activation) (None, 10) 0 \n",
|
| 226 |
-
"=================================================================\n",
|
| 227 |
-
"Total params: 1,200,778\n",
|
| 228 |
-
"Trainable params: 1,200,330\n",
|
| 229 |
-
"Non-trainable params: 448\n",
|
| 230 |
-
"_________________________________________________________________\n",
|
| 231 |
-
"None\n"
|
| 232 |
-
]
|
| 233 |
-
}
|
| 234 |
-
],
|
| 235 |
-
"source": [
|
| 236 |
-
"from tensorflow.keras.utils import to_categorical\n",
|
| 237 |
-
"import tensorflow as tf\n",
|
| 238 |
-
"from tensorflow.keras.datasets import mnist\n",
|
| 239 |
-
"from tensorflow.keras.models import Sequential\n",
|
| 240 |
-
"from tensorflow.keras.layers import Dense, Dropout, Flatten, Activation\n",
|
| 241 |
-
"from tensorflow.keras.layers import Conv2D, MaxPooling2D, BatchNormalization\n",
|
| 242 |
-
"from tensorflow.keras.optimizers import SGD \n",
|
| 243 |
-
"\n",
|
| 244 |
-
"# Training Parameters\n",
|
| 245 |
-
"batch_size = 128\n",
|
| 246 |
-
"epochs = 1\n",
|
| 247 |
-
"\n",
|
| 248 |
-
"# Lets store the number of rows and columns\n",
|
| 249 |
-
"img_rows = x_train[0].shape[0]\n",
|
| 250 |
-
"img_cols = x_train[1].shape[0]\n",
|
| 251 |
-
"\n",
|
| 252 |
-
"# Getting our date in the right 'shape' needed for Keras\n",
|
| 253 |
-
"# We need to add a 4th dimenion to our date thereby changing our\n",
|
| 254 |
-
"# Our original image shape of (60000,28,28) to (60000,28,28,1)\n",
|
| 255 |
-
"x_train = x_train.reshape(x_train.shape[0], img_rows, img_cols, 1)\n",
|
| 256 |
-
"x_test = x_test.reshape(x_test.shape[0], img_rows, img_cols, 1)\n",
|
| 257 |
-
"\n",
|
| 258 |
-
"# store the shape of a single image \n",
|
| 259 |
-
"input_shape = (img_rows, img_cols, 1)\n",
|
| 260 |
-
"\n",
|
| 261 |
-
"# change our image type to float32 data type\n",
|
| 262 |
-
"x_train = x_train.astype('float32')\n",
|
| 263 |
-
"x_test = x_test.astype('float32')\n",
|
| 264 |
-
"\n",
|
| 265 |
-
"# Normalize our data by changing the range from (0 to 255) to (0 to 1)\n",
|
| 266 |
-
"x_train /= 255.0\n",
|
| 267 |
-
"x_test /= 255.0\n",
|
| 268 |
-
"\n",
|
| 269 |
-
"print('x_train shape:', x_train.shape)\n",
|
| 270 |
-
"print(x_train.shape[0], 'train samples')\n",
|
| 271 |
-
"print(x_test.shape[0], 'test samples')\n",
|
| 272 |
-
"\n",
|
| 273 |
-
"# Now we one hot encode outputs\n",
|
| 274 |
-
"y_train = to_categorical(y_train)\n",
|
| 275 |
-
"y_test = to_categorical(y_test)\n",
|
| 276 |
-
"\n",
|
| 277 |
-
"num_classes = y_test.shape[1]\n",
|
| 278 |
-
"# Let's count the number columns in our hot encoded matrix \n",
|
| 279 |
-
"print (\"Number of Classes: \" + str(num_classes))\n",
|
| 280 |
-
"\n",
|
| 281 |
-
"num_pixels = x_train.shape[1] * x_train.shape[2]\n",
|
| 282 |
-
"\n",
|
| 283 |
-
"# create model\n",
|
| 284 |
-
"model = Sequential()\n",
|
| 285 |
-
"\n",
|
| 286 |
-
"model.add(Conv2D(32, kernel_size=(3, 3),\n",
|
| 287 |
-
" activation='relu',\n",
|
| 288 |
-
" input_shape=input_shape))\n",
|
| 289 |
-
"model.add(BatchNormalization())\n",
|
| 290 |
-
"\n",
|
| 291 |
-
"model.add(Conv2D(64, (3, 3), activation='relu'))\n",
|
| 292 |
-
"model.add(BatchNormalization())\n",
|
| 293 |
-
"\n",
|
| 294 |
-
"model.add(MaxPooling2D(pool_size=(2, 2)))\n",
|
| 295 |
-
"model.add(Dropout(0.25))\n",
|
| 296 |
-
"\n",
|
| 297 |
-
"model.add(Flatten())\n",
|
| 298 |
-
"model.add(Dense(128, activation='relu'))\n",
|
| 299 |
-
"model.add(BatchNormalization())\n",
|
| 300 |
-
"\n",
|
| 301 |
-
"model.add(Dropout(0.5))\n",
|
| 302 |
-
"model.add(Dense(num_classes))\n",
|
| 303 |
-
"model.add(Activation('softmax'))\n",
|
| 304 |
-
"\n",
|
| 305 |
-
"model.compile(loss = 'categorical_crossentropy',\n",
|
| 306 |
-
" optimizer = SGD(0.01),\n",
|
| 307 |
-
" metrics = ['accuracy'])\n",
|
| 308 |
-
"\n",
|
| 309 |
-
"print(model.summary())"
|
| 310 |
-
]
|
| 311 |
-
},
|
| 312 |
-
{
|
| 313 |
-
"cell_type": "markdown",
|
| 314 |
-
"metadata": {},
|
| 315 |
-
"source": [
|
| 316 |
-
"### Let's train our model"
|
| 317 |
-
]
|
| 318 |
-
},
|
| 319 |
-
{
|
| 320 |
-
"cell_type": "code",
|
| 321 |
-
"execution_count": 7,
|
| 322 |
-
"metadata": {},
|
| 323 |
-
"outputs": [
|
| 324 |
-
{
|
| 325 |
-
"name": "stdout",
|
| 326 |
-
"output_type": "stream",
|
| 327 |
-
"text": [
|
| 328 |
-
"Train on 60000 samples, validate on 10000 samples\n",
|
| 329 |
-
"60000/60000 [==============================] - 150s 3ms/sample - loss: 0.6229 - accuracy: 0.7877 - val_loss: 0.5973 - val_accuracy: 0.8136\n",
|
| 330 |
-
"Test loss: 0.5973387075424195\n",
|
| 331 |
-
"Test accuracy: 0.8136\n"
|
| 332 |
-
]
|
| 333 |
-
}
|
| 334 |
-
],
|
| 335 |
-
"source": [
|
| 336 |
-
"history = model.fit(x_train, y_train,\n",
|
| 337 |
-
" batch_size=batch_size,\n",
|
| 338 |
-
" epochs=epochs,\n",
|
| 339 |
-
" verbose=1,\n",
|
| 340 |
-
" validation_data=(x_test, y_test))\n",
|
| 341 |
-
"\n",
|
| 342 |
-
"score = model.evaluate(x_test, y_test, verbose=0)\n",
|
| 343 |
-
"print('Test loss:', score[0])\n",
|
| 344 |
-
"print('Test accuracy:', score[1])"
|
| 345 |
-
]
|
| 346 |
-
},
|
| 347 |
-
{
|
| 348 |
-
"cell_type": "markdown",
|
| 349 |
-
"metadata": {},
|
| 350 |
-
"source": [
|
| 351 |
-
"### Let's test out our model"
|
| 352 |
-
]
|
| 353 |
-
},
|
| 354 |
-
{
|
| 355 |
-
"cell_type": "code",
|
| 356 |
-
"execution_count": 8,
|
| 357 |
-
"metadata": {},
|
| 358 |
-
"outputs": [],
|
| 359 |
-
"source": [
|
| 360 |
-
"import cv2\n",
|
| 361 |
-
"import numpy as np\n",
|
| 362 |
-
"\n",
|
| 363 |
-
"def getLabel(input_class):\n",
|
| 364 |
-
" number = int(input_class)\n",
|
| 365 |
-
" if number == 0:\n",
|
| 366 |
-
" return \"T-shirt/top \"\n",
|
| 367 |
-
" if number == 1:\n",
|
| 368 |
-
" return \"Trouser\"\n",
|
| 369 |
-
" if number == 2:\n",
|
| 370 |
-
" return \"Pullover\"\n",
|
| 371 |
-
" if number == 3:\n",
|
| 372 |
-
" return \"Dress\"\n",
|
| 373 |
-
" if number == 4:\n",
|
| 374 |
-
" return \"Coat\"\n",
|
| 375 |
-
" if number == 5:\n",
|
| 376 |
-
" return \"Sandal\"\n",
|
| 377 |
-
" if number == 6:\n",
|
| 378 |
-
" return \"Shirt\"\n",
|
| 379 |
-
" if number == 7:\n",
|
| 380 |
-
" return \"Sneaker\"\n",
|
| 381 |
-
" if number == 8:\n",
|
| 382 |
-
" return \"Bag\"\n",
|
| 383 |
-
" if number == 9:\n",
|
| 384 |
-
" return \"Ankle boot\"\n",
|
| 385 |
-
"\n",
|
| 386 |
-
"def draw_test(name, pred, actual, input_im):\n",
|
| 387 |
-
" BLACK = [0,0,0]\n",
|
| 388 |
-
"\n",
|
| 389 |
-
" res = getLabel(pred)\n",
|
| 390 |
-
" actual = getLabel(actual) \n",
|
| 391 |
-
" expanded_image = cv2.copyMakeBorder(input_im, 0, 0, 0, 4*imageL.shape[0] ,cv2.BORDER_CONSTANT,value=BLACK)\n",
|
| 392 |
-
" expanded_image = cv2.cvtColor(expanded_image, cv2.COLOR_GRAY2BGR)\n",
|
| 393 |
-
" cv2.putText(expanded_image, \"Predicted - \" + str(res), (152, 70) , cv2.FONT_HERSHEY_COMPLEX_SMALL,1, (0,255,0), 1)\n",
|
| 394 |
-
" cv2.putText(expanded_image, \" Actual - \" + str(actual), (152, 90) , cv2.FONT_HERSHEY_COMPLEX_SMALL,1, (0,0,255), 1)\n",
|
| 395 |
-
" cv2.imshow(name, expanded_image)\n",
|
| 396 |
-
"\n",
|
| 397 |
-
"\n",
|
| 398 |
-
"for i in range(0,10):\n",
|
| 399 |
-
" rand = np.random.randint(0,len(x_test))\n",
|
| 400 |
-
" input_im = x_test[rand]\n",
|
| 401 |
-
" actual = y_test[rand].argmax(axis=0)\n",
|
| 402 |
-
" imageL = cv2.resize(input_im, None, fx=4, fy=4, interpolation = cv2.INTER_CUBIC)\n",
|
| 403 |
-
" input_im = input_im.reshape(1,28,28,1) \n",
|
| 404 |
-
" \n",
|
| 405 |
-
" ## Get Prediction\n",
|
| 406 |
-
" res = str(model.predict_classes(input_im, 1, verbose = 0)[0])\n",
|
| 407 |
-
"\n",
|
| 408 |
-
" draw_test(\"Prediction\", res, actual, imageL) \n",
|
| 409 |
-
" cv2.waitKey(0)\n",
|
| 410 |
-
"\n",
|
| 411 |
-
"cv2.destroyAllWindows()"
|
| 412 |
-
]
|
| 413 |
-
},
|
| 414 |
-
{
|
| 415 |
-
"cell_type": "code",
|
| 416 |
-
"execution_count": null,
|
| 417 |
-
"metadata": {},
|
| 418 |
-
"outputs": [],
|
| 419 |
-
"source": [
|
| 420 |
-
"\n"
|
| 421 |
-
]
|
| 422 |
-
}
|
| 423 |
-
],
|
| 424 |
-
"metadata": {
|
| 425 |
-
"kernelspec": {
|
| 426 |
-
"display_name": "Python 3",
|
| 427 |
-
"language": "python",
|
| 428 |
-
"name": "python3"
|
| 429 |
-
},
|
| 430 |
-
"language_info": {
|
| 431 |
-
"codemirror_mode": {
|
| 432 |
-
"name": "ipython",
|
| 433 |
-
"version": 3
|
| 434 |
-
},
|
| 435 |
-
"file_extension": ".py",
|
| 436 |
-
"mimetype": "text/x-python",
|
| 437 |
-
"name": "python",
|
| 438 |
-
"nbconvert_exporter": "python",
|
| 439 |
-
"pygments_lexer": "ipython3",
|
| 440 |
-
"version": "3.7.4"
|
| 441 |
-
}
|
| 442 |
-
},
|
| 443 |
-
"nbformat": 4,
|
| 444 |
-
"nbformat_minor": 2
|
| 445 |
-
}
|
|
|
|
| 1 |
+
version https://git-lfs.github.com/spec/v1
|
| 2 |
+
oid sha256:7f72f62468d824350f8be3974df4eda6579c69a925f87b76907c1f70f13f0425
|
| 3 |
+
size 26935
|
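13.4 Fashion MNIST is likewise reduced to an LFS pointer. If the raw IDX ubyte files that the removed read_idx parser expects aren't on disk, the same dataset ships with Keras; here is a minimal sketch of the equivalent load-and-preprocess step (assuming only that tensorflow is installed):

# Equivalent of the removed notebook's read_idx + reshape + normalize steps,
# using the Fashion-MNIST copy bundled with Keras instead of raw IDX files.
from tensorflow.keras.datasets import fashion_mnist
from tensorflow.keras.utils import to_categorical

(x_train, y_train), (x_test, y_test) = fashion_mnist.load_data()

# Add the channel axis Keras conv layers expect: (60000, 28, 28) -> (60000, 28, 28, 1)
x_train = x_train.reshape(-1, 28, 28, 1).astype('float32') / 255.0
x_test = x_test.reshape(-1, 28, 28, 1).astype('float32') / 255.0

# One-hot encode the labels (10 classes: T-shirt/top through Ankle boot)
y_train = to_categorical(y_train)
y_test = to_categorical(y_test)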
14. ImageNet and Pretrained Models VGG16_ResNet50_InceptionV3/14.1 Experimenting with pre-trained Models in Keras.ipynb
CHANGED
|
@@ -1,227 +1,3 @@
|
|
| 1 |
-
|
| 2 |
-
|
| 3 |
-
|
| 4 |
-
"cell_type": "markdown",
|
| 5 |
-
"metadata": {},
|
| 6 |
-
"source": [
|
| 7 |
-
"### Let's start with loading ResNet50 "
|
| 8 |
-
]
|
| 9 |
-
},
|
| 10 |
-
{
|
| 11 |
-
"cell_type": "code",
|
| 12 |
-
"execution_count": 1,
|
| 13 |
-
"metadata": {},
|
| 14 |
-
"outputs": [],
|
| 15 |
-
"source": [
|
| 16 |
-
"from tensorflow.keras.applications.resnet50 import ResNet50\n",
|
| 17 |
-
"from tensorflow.keras.preprocessing import image\n",
|
| 18 |
-
"from tensorflow.keras.applications.resnet50 import preprocess_input, decode_predictions\n",
|
| 19 |
-
"import numpy as np\n",
|
| 20 |
-
"\n",
|
| 21 |
-
"resnet_model = ResNet50(weights='imagenet')"
|
| 22 |
-
]
|
| 23 |
-
},
|
| 24 |
-
{
|
| 25 |
-
"cell_type": "code",
|
| 26 |
-
"execution_count": 2,
|
| 27 |
-
"metadata": {},
|
| 28 |
-
"outputs": [
|
| 29 |
-
{
|
| 30 |
-
"name": "stdout",
|
| 31 |
-
"output_type": "stream",
|
| 32 |
-
"text": [
|
| 33 |
-
"Predicted: [('n02100583', 'vizsla', 0.5282586), ('n02092339', 'Weimaraner', 0.32402116), ('n02099849', 'Chesapeake_Bay_retriever', 0.07540441)]\n"
|
| 34 |
-
]
|
| 35 |
-
}
|
| 36 |
-
],
|
| 37 |
-
"source": [
|
| 38 |
-
"from tensorflow.keras.preprocessing import image\n",
|
| 39 |
-
"\n",
|
| 40 |
-
"img_path = './images/dog.jpg' \n",
|
| 41 |
-
"\n",
|
| 42 |
-
"img = image.load_img(img_path, target_size=(224, 224))\n",
|
| 43 |
-
"x = image.img_to_array(img)\n",
|
| 44 |
-
"x = np.expand_dims(x, axis=0)\n",
|
| 45 |
-
"x = preprocess_input(x)\n",
|
| 46 |
-
"\n",
|
| 47 |
-
"preds = resnet_model.predict(x)\n",
|
| 48 |
-
"# decode the results into a list of tuples (class, description, probability)\n",
|
| 49 |
-
"# (one such list for each sample in the batch)\n",
|
| 50 |
-
"print('Predicted:', decode_predictions(preds, top=3)[0])"
|
| 51 |
-
]
|
| 52 |
-
},
|
| 53 |
-
{
|
| 54 |
-
"cell_type": "markdown",
|
| 55 |
-
"metadata": {},
|
| 56 |
-
"source": [
|
| 57 |
-
"### Let's run through a few test images"
|
| 58 |
-
]
|
| 59 |
-
},
|
| 60 |
-
{
|
| 61 |
-
"cell_type": "code",
|
| 62 |
-
"execution_count": 5,
|
| 63 |
-
"metadata": {},
|
| 64 |
-
"outputs": [],
|
| 65 |
-
"source": [
|
| 66 |
-
"import cv2\n",
|
| 67 |
-
"from os import listdir\n",
|
| 68 |
-
"from os.path import isfile, join\n",
|
| 69 |
-
"\n",
|
| 70 |
-
"# Our openCV function that displays the image and it's predicted labels \n",
|
| 71 |
-
"def draw_test(name, preditions, input_im):\n",
|
| 72 |
-
" \"\"\"Function displays the output of the prediction alongside the orignal image\"\"\"\n",
|
| 73 |
-
" BLACK = [0,0,0]\n",
|
| 74 |
-
" expanded_image = cv2.copyMakeBorder(input_im, 0, 0, 0, imageL.shape[1]+300 ,cv2.BORDER_CONSTANT,value=BLACK)\n",
|
| 75 |
-
" img_width = input_im.shape[1]\n",
|
| 76 |
-
" for (i,predition) in enumerate(preditions):\n",
|
| 77 |
-
" string = str(predition[1]) + \" \" + str(predition[2])\n",
|
| 78 |
-
" cv2.putText(expanded_image,str(name),(img_width + 50,50),cv2.FONT_HERSHEY_COMPLEX_SMALL,2,(0,0,255),1)\n",
|
| 79 |
-
" cv2.putText(expanded_image,string,(img_width + 50,50+((i+1)*50)),cv2.FONT_HERSHEY_COMPLEX_SMALL,2,(0,255,0),1)\n",
|
| 80 |
-
" cv2.imshow(name, expanded_image)\n",
|
| 81 |
-
"\n",
|
| 82 |
-
"# Get images located in ./images folder \n",
|
| 83 |
-
"mypath = \"./images/\"\n",
|
| 84 |
-
"file_names = [f for f in listdir(mypath) if isfile(join(mypath, f))]\n",
|
| 85 |
-
"\n",
|
| 86 |
-
"# Loop through images run them through our classifer\n",
|
| 87 |
-
"for file in file_names:\n",
|
| 88 |
-
"\n",
|
| 89 |
-
" from tensorflow.keras.preprocessing import image # Need to reload as opencv2 seems to have a conflict\n",
|
| 90 |
-
" img = image.load_img(mypath+file, target_size=(224, 224))\n",
|
| 91 |
-
" x = image.img_to_array(img)\n",
|
| 92 |
-
" x = np.expand_dims(x, axis=0)\n",
|
| 93 |
-
" x = preprocess_input(x)\n",
|
| 94 |
-
" \n",
|
| 95 |
-
" #load image using opencv\n",
|
| 96 |
-
" img2 = cv2.imread(mypath+file)\n",
|
| 97 |
-
" imageL = cv2.resize(img2, None, fx=.5, fy=.5, interpolation = cv2.INTER_CUBIC) \n",
|
| 98 |
-
" \n",
|
| 99 |
-
" # Get Predictions\n",
|
| 100 |
-
" preds = resnet_model.predict(x)\n",
|
| 101 |
-
" preditions = decode_predictions(preds, top=3)[0]\n",
|
| 102 |
-
" draw_test(\"Predictions\", preditions, imageL) \n",
|
| 103 |
-
" cv2.waitKey(0)\n",
|
| 104 |
-
"\n",
|
| 105 |
-
"cv2.destroyAllWindows()"
|
| 106 |
-
]
|
| 107 |
-
},
|
| 108 |
-
{
|
| 109 |
-
"cell_type": "markdown",
|
| 110 |
-
"metadata": {},
|
| 111 |
-
"source": [
|
| 112 |
-
"### Let's now load VGG16 and InceptionV3"
|
| 113 |
-
]
|
| 114 |
-
},
|
| 115 |
-
{
|
| 116 |
-
"cell_type": "code",
|
| 117 |
-
"execution_count": 6,
|
| 118 |
-
"metadata": {},
|
| 119 |
-
"outputs": [],
|
| 120 |
-
"source": [
|
| 121 |
-
"import tensorflow as tf\n",
|
| 122 |
-
"import numpy as np\n",
|
| 123 |
-
"from tensorflow.keras.applications import vgg16, inception_v3, resnet50\n",
|
| 124 |
-
" \n",
|
| 125 |
-
"#Loads the VGG16 model\n",
|
| 126 |
-
"vgg_model = vgg16.VGG16(weights='imagenet')\n",
|
| 127 |
-
" \n",
|
| 128 |
-
"# Loads the Inception_V3 model\n",
|
| 129 |
-
"inception_model = inception_v3.InceptionV3(weights='imagenet')\n",
|
| 130 |
-
" \n",
|
| 131 |
-
"# Loads the ResNet50 model \n",
|
| 132 |
-
"# uncomment the line below if you didn't load resnet50 beforehand\n",
|
| 133 |
-
"#resnet_model = resnet50.ResNet50(weights='imagenet')"
|
| 134 |
-
]
|
| 135 |
-
},
|
| 136 |
-
{
|
| 137 |
-
"cell_type": "markdown",
|
| 138 |
-
"metadata": {},
|
| 139 |
-
"source": [
|
| 140 |
-
"### Compare all 3 Models with the same test images"
|
| 141 |
-
]
|
| 142 |
-
},
|
| 143 |
-
{
|
| 144 |
-
"cell_type": "code",
|
| 145 |
-
"execution_count": 24,
|
| 146 |
-
"metadata": {},
|
| 147 |
-
"outputs": [],
|
| 148 |
-
"source": [
|
| 149 |
-
"def getImage(path, dim=224, inception = False):\n",
|
| 150 |
-
" img = image.load_img(path, target_size=(dim, dim))\n",
|
| 151 |
-
" x = image.img_to_array(img)\n",
|
| 152 |
-
" x = np.expand_dims(x, axis=0)\n",
|
| 153 |
-
" if inception:\n",
|
| 154 |
-
" x /= 255.\n",
|
| 155 |
-
" x -= 0.5\n",
|
| 156 |
-
" x *= 2.\n",
|
| 157 |
-
" else:\n",
|
| 158 |
-
" x = preprocess_input(x)\n",
|
| 159 |
-
" return x"
|
| 160 |
-
]
|
| 161 |
-
},
|
| 162 |
-
{
|
| 163 |
-
"cell_type": "code",
|
| 164 |
-
"execution_count": 25,
|
| 165 |
-
"metadata": {},
|
| 166 |
-
"outputs": [],
|
| 167 |
-
"source": [
|
| 168 |
-
"# Get images located in ./images folder \n",
|
| 169 |
-
"mypath = \"./images/\"\n",
|
| 170 |
-
"file_names = [f for f in listdir(mypath) if isfile(join(mypath, f))]\n",
|
| 171 |
-
"\n",
|
| 172 |
-
"# Loop through images run them through our classifer\n",
|
| 173 |
-
"for file in file_names:\n",
|
| 174 |
-
"\n",
|
| 175 |
-
" from tensorflow.keras.preprocessing import image # Need to reload as opencv2 seems to have a conflict\n",
|
| 176 |
-
" #img = image.load_img(mypath+file, target_size=(dim, dim))\n",
|
| 177 |
-
" x = getImage(mypath+file, 229)\n",
|
| 178 |
-
" #load image using opencv\n",
|
| 179 |
-
" img2 = cv2.imread(mypath+file)\n",
|
| 180 |
-
" imageL = cv2.resize(img2, None, fx=.5, fy=.5, interpolation = cv2.INTER_CUBIC) \n",
|
| 181 |
-
" \n",
|
| 182 |
-
" # Get VGG16 Predictions\n",
|
| 183 |
-
" x = getImage(mypath+file, 224)\n",
|
| 184 |
-
" preds_vgg_model = vgg_model.predict(x)\n",
|
| 185 |
-
" preditions_vgg = decode_predictions(preds_vgg_model, top=3)[0]\n",
|
| 186 |
-
" draw_test(\"VGG16 Predictions\", preditions_vgg, imageL) \n",
|
| 187 |
-
" \n",
|
| 188 |
-
" # Get Inception_V3 Predictions\n",
|
| 189 |
-
" x = getImage(mypath+file, 299, inception = True)\n",
|
| 190 |
-
" preds_inception = inception_model.predict(x)\n",
|
| 191 |
-
" preditions_inception = decode_predictions(preds_inception, top=3)[0]\n",
|
| 192 |
-
" draw_test(\"Inception_V3 Predictions\", preditions_inception, imageL) \n",
|
| 193 |
-
"\n",
|
| 194 |
-
" # Get ResNet50 Predictions\n",
|
| 195 |
-
" x = getImage(mypath+file, 224)\n",
|
| 196 |
-
" preds_resnet = resnet_model.predict(x)\n",
|
| 197 |
-
" preditions_resnet = decode_predictions(preds_resnet, top=3)[0]\n",
|
| 198 |
-
" draw_test(\"ResNet50 Predictions\", preditions_resnet, imageL) \n",
|
| 199 |
-
" \n",
|
| 200 |
-
" cv2.waitKey(0)\n",
|
| 201 |
-
"\n",
|
| 202 |
-
"cv2.destroyAllWindows()\n"
|
| 203 |
-
]
|
| 204 |
-
}
|
| 205 |
-
],
|
| 206 |
-
"metadata": {
|
| 207 |
-
"kernelspec": {
|
| 208 |
-
"display_name": "Python 3",
|
| 209 |
-
"language": "python",
|
| 210 |
-
"name": "python3"
|
| 211 |
-
},
|
| 212 |
-
"language_info": {
|
| 213 |
-
"codemirror_mode": {
|
| 214 |
-
"name": "ipython",
|
| 215 |
-
"version": 3
|
| 216 |
-
},
|
| 217 |
-
"file_extension": ".py",
|
| 218 |
-
"mimetype": "text/x-python",
|
| 219 |
-
"name": "python",
|
| 220 |
-
"nbconvert_exporter": "python",
|
| 221 |
-
"pygments_lexer": "ipython3",
|
| 222 |
-
"version": "3.7.4"
|
| 223 |
-
}
|
| 224 |
-
},
|
| 225 |
-
"nbformat": 4,
|
| 226 |
-
"nbformat_minor": 2
|
| 227 |
-
}
|
|
|
|
| 1 |
+
version https://git-lfs.github.com/spec/v1
|
| 2 |
+
oid sha256:ad641d6516f4e7974dfe8149b95fabfaf34b8f1cd7f5b4373365177ec7e3c792
|
| 3 |
+
size 7394
|
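The removed 14.1 notebook rescales InceptionV3 inputs by hand (divide by 255, subtract 0.5, multiply by 2). Each module under tensorflow.keras.applications ships its own preprocess_input and decode_predictions, which makes a single helper possible; a hedged sketch follows, where predict_top3 and the image path are illustrative names, not from the source:

# Sketch: one prediction helper that works for any Keras application module.
# Passing the module itself gives the matching preprocess_input/decode_predictions.
import numpy as np
from tensorflow.keras.preprocessing import image
from tensorflow.keras.applications import vgg16, inception_v3, resnet50

def predict_top3(model, module, img_path, dim):
    # Load a single image, add the batch axis, apply model-specific preprocessing
    img = image.load_img(img_path, target_size=(dim, dim))
    x = module.preprocess_input(np.expand_dims(image.img_to_array(img), axis=0))
    return module.decode_predictions(model.predict(x), top=3)[0]

# Usage (224 for VGG16/ResNet50, 299 for InceptionV3; the path is illustrative):
# vgg_model = vgg16.VGG16(weights='imagenet')
# print(predict_top3(vgg_model, vgg16, './images/dog.jpg', 224))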
15. Transfer Learning & Fine Tuning/15.2 Using MobileNet to make a Monkey Breed Classifier.ipynb
CHANGED
|
@@ -1,657 +1,3 @@
|
|
| 1 |
-
|
| 2 |
-
|
| 3 |
-
|
| 4 |
-
"cell_type": "markdown",
|
| 5 |
-
"metadata": {},
|
| 6 |
-
"source": [
|
| 7 |
-
"# Using MobileNet for our Monkey Classifer\n",
|
| 8 |
-
"\n",
|
| 9 |
-
"### Loading the MobileNet Model"
|
| 10 |
-
]
|
| 11 |
-
},
|
| 12 |
-
{
|
| 13 |
-
"cell_type": "markdown",
|
| 14 |
-
"metadata": {},
|
| 15 |
-
"source": [
|
| 16 |
-
"Freeze all layers except the top 4, as we'll only be training the top 4"
|
| 17 |
-
]
|
| 18 |
-
},
|
| 19 |
-
{
|
| 20 |
-
"cell_type": "code",
|
| 21 |
-
"execution_count": 1,
|
| 22 |
-
"metadata": {},
|
| 23 |
-
"outputs": [
|
| 24 |
-
{
|
| 25 |
-
"name": "stdout",
|
| 26 |
-
"output_type": "stream",
|
| 27 |
-
"text": [
|
| 28 |
-
"Downloading data from https://github.com/fchollet/deep-learning-models/releases/download/v0.6/mobilenet_1_0_224_tf_no_top.h5\n",
|
| 29 |
-
"17227776/17225924 [==============================] - 14s 1us/step\n",
|
| 30 |
-
"0 InputLayer False\n",
|
| 31 |
-
"1 ZeroPadding2D False\n",
|
| 32 |
-
"2 Conv2D False\n",
|
| 33 |
-
"3 BatchNormalization False\n",
|
| 34 |
-
"4 ReLU False\n",
|
| 35 |
-
"5 DepthwiseConv2D False\n",
|
| 36 |
-
"6 BatchNormalization False\n",
|
| 37 |
-
"7 ReLU False\n",
|
| 38 |
-
"8 Conv2D False\n",
|
| 39 |
-
"9 BatchNormalization False\n",
|
| 40 |
-
"10 ReLU False\n",
|
| 41 |
-
"11 ZeroPadding2D False\n",
|
| 42 |
-
"12 DepthwiseConv2D False\n",
|
| 43 |
-
"13 BatchNormalization False\n",
|
| 44 |
-
"14 ReLU False\n",
|
| 45 |
-
"15 Conv2D False\n",
|
| 46 |
-
"16 BatchNormalization False\n",
|
| 47 |
-
"17 ReLU False\n",
|
| 48 |
-
"18 DepthwiseConv2D False\n",
|
| 49 |
-
"19 BatchNormalization False\n",
|
| 50 |
-
"20 ReLU False\n",
|
| 51 |
-
"21 Conv2D False\n",
|
| 52 |
-
"22 BatchNormalization False\n",
|
| 53 |
-
"23 ReLU False\n",
|
| 54 |
-
"24 ZeroPadding2D False\n",
|
| 55 |
-
"25 DepthwiseConv2D False\n",
|
| 56 |
-
"26 BatchNormalization False\n",
|
| 57 |
-
"27 ReLU False\n",
|
| 58 |
-
"28 Conv2D False\n",
|
| 59 |
-
"29 BatchNormalization False\n",
|
| 60 |
-
"30 ReLU False\n",
|
| 61 |
-
"31 DepthwiseConv2D False\n",
|
| 62 |
-
"32 BatchNormalization False\n",
|
| 63 |
-
"33 ReLU False\n",
|
| 64 |
-
"34 Conv2D False\n",
|
| 65 |
-
"35 BatchNormalization False\n",
|
| 66 |
-
"36 ReLU False\n",
|
| 67 |
-
"37 ZeroPadding2D False\n",
|
| 68 |
-
"38 DepthwiseConv2D False\n",
|
| 69 |
-
"39 BatchNormalization False\n",
|
| 70 |
-
"40 ReLU False\n",
|
| 71 |
-
"41 Conv2D False\n",
|
| 72 |
-
"42 BatchNormalization False\n",
|
| 73 |
-
"43 ReLU False\n",
|
| 74 |
-
"44 DepthwiseConv2D False\n",
|
| 75 |
-
"45 BatchNormalization False\n",
|
| 76 |
-
"46 ReLU False\n",
|
| 77 |
-
"47 Conv2D False\n",
|
| 78 |
-
"48 BatchNormalization False\n",
|
| 79 |
-
"49 ReLU False\n",
|
| 80 |
-
"50 DepthwiseConv2D False\n",
|
| 81 |
-
"51 BatchNormalization False\n",
|
| 82 |
-
"52 ReLU False\n",
|
| 83 |
-
"53 Conv2D False\n",
|
| 84 |
-
"54 BatchNormalization False\n",
|
| 85 |
-
"55 ReLU False\n",
|
| 86 |
-
"56 DepthwiseConv2D False\n",
|
| 87 |
-
"57 BatchNormalization False\n",
|
| 88 |
-
"58 ReLU False\n",
|
| 89 |
-
"59 Conv2D False\n",
|
| 90 |
-
"60 BatchNormalization False\n",
|
| 91 |
-
"61 ReLU False\n",
|
| 92 |
-
"62 DepthwiseConv2D False\n",
|
| 93 |
-
"63 BatchNormalization False\n",
|
| 94 |
-
"64 ReLU False\n",
|
| 95 |
-
"65 Conv2D False\n",
|
| 96 |
-
"66 BatchNormalization False\n",
|
| 97 |
-
"67 ReLU False\n",
|
| 98 |
-
"68 DepthwiseConv2D False\n",
|
| 99 |
-
"69 BatchNormalization False\n",
|
| 100 |
-
"70 ReLU False\n",
|
| 101 |
-
"71 Conv2D False\n",
|
| 102 |
-
"72 BatchNormalization False\n",
|
| 103 |
-
"73 ReLU False\n",
|
| 104 |
-
"74 ZeroPadding2D False\n",
|
| 105 |
-
"75 DepthwiseConv2D False\n",
|
| 106 |
-
"76 BatchNormalization False\n",
|
| 107 |
-
"77 ReLU False\n",
|
| 108 |
-
"78 Conv2D False\n",
|
| 109 |
-
"79 BatchNormalization False\n",
|
| 110 |
-
"80 ReLU False\n",
|
| 111 |
-
"81 DepthwiseConv2D False\n",
|
| 112 |
-
"82 BatchNormalization False\n",
|
| 113 |
-
"83 ReLU False\n",
|
| 114 |
-
"84 Conv2D False\n",
|
| 115 |
-
"85 BatchNormalization False\n",
|
| 116 |
-
"86 ReLU False\n"
|
| 117 |
-
]
|
| 118 |
-
}
|
| 119 |
-
],
|
| 120 |
-
"source": [
|
| 121 |
-
"from tensorflow.keras.applications import MobileNet\n",
|
| 122 |
-
"\n",
|
| 123 |
-
"# MobileNet was designed to work on 224 x 224 pixel input images sizes\n",
|
| 124 |
-
"img_rows, img_cols = 224, 224 \n",
|
| 125 |
-
"\n",
|
| 126 |
-
"# Re-loads the MobileNet model without the top or FC layers\n",
|
| 127 |
-
"MobileNet = MobileNet(weights = 'imagenet', \n",
|
| 128 |
-
" include_top = False, \n",
|
| 129 |
-
" input_shape = (img_rows, img_cols, 3))\n",
|
| 130 |
-
"\n",
|
| 131 |
-
"# Here we freeze the last 4 layers \n",
|
| 132 |
-
"# Layers are set to trainable as True by default\n",
|
| 133 |
-
"for layer in MobileNet.layers:\n",
|
| 134 |
-
" layer.trainable = False\n",
|
| 135 |
-
" \n",
|
| 136 |
-
"# Let's print our layers \n",
|
| 137 |
-
"for (i,layer) in enumerate(MobileNet.layers):\n",
|
| 138 |
-
" print(str(i) + \" \"+ layer.__class__.__name__, layer.trainable)"
|
| 139 |
-
]
|
| 140 |
-
},
|
| 141 |
-
{
|
| 142 |
-
"cell_type": "markdown",
|
| 143 |
-
"metadata": {},
|
| 144 |
-
"source": [
|
| 145 |
-
"### Let's make a function that returns our FC Head"
|
| 146 |
-
]
|
| 147 |
-
},
|
| 148 |
-
{
|
| 149 |
-
"cell_type": "code",
|
| 150 |
-
"execution_count": 2,
|
| 151 |
-
"metadata": {},
|
| 152 |
-
"outputs": [],
|
| 153 |
-
"source": [
|
| 154 |
-
"def addTopModelMobileNet(bottom_model, num_classes):\n",
|
| 155 |
-
" \"\"\"creates the top or head of the model that will be \n",
|
| 156 |
-
" placed ontop of the bottom layers\"\"\"\n",
|
| 157 |
-
"\n",
|
| 158 |
-
" top_model = bottom_model.output\n",
|
| 159 |
-
" top_model = GlobalAveragePooling2D()(top_model)\n",
|
| 160 |
-
" top_model = Dense(1024,activation='relu')(top_model)\n",
|
| 161 |
-
" top_model = Dense(1024,activation='relu')(top_model)\n",
|
| 162 |
-
" top_model = Dense(512,activation='relu')(top_model)\n",
|
| 163 |
-
" top_model = Dense(num_classes,activation='softmax')(top_model)\n",
|
| 164 |
-
" return top_model"
|
| 165 |
-
]
|
| 166 |
-
},
|
### Let's add our FC Head back onto MobileNet

```python
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense, Dropout, Activation, Flatten, GlobalAveragePooling2D
from tensorflow.keras.layers import Conv2D, MaxPooling2D, ZeroPadding2D
from tensorflow.keras.layers import BatchNormalization
from tensorflow.keras.models import Model

# Set our class number to 10 (one per monkey breed)
num_classes = 10

FC_Head = addTopModelMobileNet(MobileNet, num_classes)

model = Model(inputs = MobileNet.input, outputs = FC_Head)

print(model.summary())
```

Output (the 13 depthwise-separable MobileNet blocks are elided for brevity):

```
Model: "model_1"
_________________________________________________________________
Layer (type)                 Output Shape              Param #
=================================================================
input_1 (InputLayer)         [(None, 224, 224, 3)]     0
conv1_pad (ZeroPadding2D)    (None, 225, 225, 3)       0
conv1 (Conv2D)               (None, 112, 112, 32)      864
...
conv_pw_13_relu (ReLU)       (None, 7, 7, 1024)        0
global_average_pooling2d_1 ( (None, 1024)              0
dense_4 (Dense)              (None, 1024)              1049600
dense_5 (Dense)              (None, 1024)              1049600
dense_6 (Dense)              (None, 512)               524800
dense_7 (Dense)              (None, 10)                5130
=================================================================
Total params: 5,857,994
Trainable params: 2,629,130
Non-trainable params: 3,228,864
_________________________________________________________________
None
```
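All 2,629,130 trainable parameters sit in the new head (1,049,600 + 1,049,600 + 524,800 + 5,130), while the 3,228,864 MobileNet base parameters are frozen. The cell that loads and freezes the base appears earlier in the notebook and is not part of this excerpt; a minimal sketch of the setup that would produce exactly this split, assuming a 224 x 224 input, is:

```python
# Sketch only: reconstruction of the earlier base-loading cell, not the
# author's exact code. The notebook reuses the class name for the model
# instance, which is why addTopModelMobileNet(MobileNet, ...) above works.
from tensorflow.keras.applications import MobileNet

img_rows, img_cols = 224, 224

MobileNet = MobileNet(weights='imagenet',
                      include_top=False,
                      input_shape=(img_rows, img_cols, 3))

# Freeze the convolutional base so only the new FC head trains,
# matching the trainable / non-trainable split in the summary above.
for layer in MobileNet.layers:
    layer.trainable = False
```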
### Loading our Monkey Breed Dataset

```python
from tensorflow.keras.preprocessing.image import ImageDataGenerator

train_data_dir = './monkey_breed/train'
validation_data_dir = './monkey_breed/validation'

# Let's use some data augmentation
train_datagen = ImageDataGenerator(
      rescale=1./255,
      rotation_range=45,
      width_shift_range=0.3,
      height_shift_range=0.3,
      horizontal_flip=True,
      fill_mode='nearest')

validation_datagen = ImageDataGenerator(rescale=1./255)

# set our batch size (typically 16-32 on most mid-tier systems)
batch_size = 32

train_generator = train_datagen.flow_from_directory(
        train_data_dir,
        target_size=(img_rows, img_cols),
        batch_size=batch_size,
        class_mode='categorical')

validation_generator = validation_datagen.flow_from_directory(
        validation_data_dir,
        target_size=(img_rows, img_cols),
        batch_size=batch_size,
        class_mode='categorical')
```

Output:

```
Found 1098 images belonging to 10 classes.
Found 272 images belonging to 10 classes.
```
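A quick sanity check that is not in the original notebook, but useful before training: pull one batch from the augmented generator and eyeball the images with their labels (a sketch, assuming matplotlib is available):

```python
import matplotlib.pyplot as plt
import numpy as np

# One batch of shape (32, 224, 224, 3); values already rescaled to [0, 1]
images, labels = next(train_generator)
class_names = list(train_generator.class_indices.keys())

fig, axes = plt.subplots(1, 4, figsize=(12, 3))
for ax, img, lab in zip(axes, images, labels):
    ax.imshow(img)
    ax.set_title(class_names[int(np.argmax(lab))], fontsize=8)
    ax.axis('off')
plt.show()
```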
### Training our Model
- Note we're using checkpointing and early stopping

```python
from tensorflow.keras.optimizers import RMSprop
from tensorflow.keras.callbacks import ModelCheckpoint, EarlyStopping

checkpoint = ModelCheckpoint("monkey_breed_mobileNet.h5",
                             monitor="val_loss",
                             mode="min",
                             save_best_only = True,
                             verbose=1)

earlystop = EarlyStopping(monitor = 'val_loss',
                          min_delta = 0,
                          patience = 3,
                          verbose = 1,
                          restore_best_weights = True)

# we put our callbacks into a callback list
callbacks = [earlystop, checkpoint]

# Compile with RMSprop at its default learning rate of 0.001
model.compile(loss = 'categorical_crossentropy',
              optimizer = RMSprop(lr = 0.001),
              metrics = ['accuracy'])

# Enter the number of training and validation samples here
# (flow_from_directory found 1098 training and 272 validation images)
nb_train_samples = 1098
nb_validation_samples = 272

# We train for just 1 epoch here as a quick demonstration
epochs = 1
batch_size = 16

history = model.fit_generator(
    train_generator,
    steps_per_epoch = nb_train_samples // batch_size,
    epochs = epochs,
    callbacks = callbacks,
    validation_data = validation_generator,
    validation_steps = nb_validation_samples // batch_size)
```
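`fit_generator` was the standard call for generator input when this notebook was written; from TensorFlow 2.1 onward `model.fit` accepts generators directly and `fit_generator` is deprecated (and removed in later releases). The equivalent modern call:

```python
# Modern equivalent (TF >= 2.1): model.fit handles generators/Sequences directly
history = model.fit(
    train_generator,
    steps_per_epoch = nb_train_samples // batch_size,
    epochs = epochs,
    callbacks = callbacks,
    validation_data = validation_generator,
    validation_steps = nb_validation_samples // batch_size)
```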
### Loading our classifier

```python
from tensorflow.keras.models import load_model

classifier = load_model('monkey_breed_mobileNet.h5')
```

This run failed because the checkpoint file was not on disk. Abridged traceback (ANSI escape codes stripped):

```
OSError                                   Traceback (most recent call last)
<ipython-input-10-071d0eb65c49> in <module>
      1 from tensorflow.keras.models import load_model
      2
----> 3 classifier = load_model('monkey_breed_mobileNet.h5')

OSError: SavedModel file does not exist at: monkey_breed_mobileNet.h5/{saved_model.pbtxt|saved_model.pb}
```
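The error message is misleading: in TF 2.0, when the `.h5` path does not exist (so `h5py.is_hdf5` returns False), `load_model` falls through to the SavedModel loader, which then complains about a missing `saved_model.pb`. The real problem is simply that the checkpoint was never written. A defensive loading sketch that surfaces the actual cause:

```python
import os
from tensorflow.keras.models import load_model

weights_path = 'monkey_breed_mobileNet.h5'
if not os.path.isfile(weights_path):
    # Fail with a message that names the real problem instead of
    # TF 2.0's confusing SavedModel fall-through error.
    raise FileNotFoundError(
        f"{weights_path} not found - run the training cell first so the "
        "ModelCheckpoint callback can write it, or call model.save().")
classifier = load_model(weights_path)
```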
### Testing our classifier on some test images

```python
import os
import cv2
import numpy as np
from os import listdir
from os.path import isfile, join

monkey_breeds_dict = {"[0]": "mantled_howler",
                      "[1]": "patas_monkey",
                      "[2]": "bald_uakari",
                      "[3]": "japanese_macaque",
                      "[4]": "pygmy_marmoset",
                      "[5]": "white_headed_capuchin",
                      "[6]": "silvery_marmoset",
                      "[7]": "common_squirrel_monkey",
                      "[8]": "black_headed_night_monkey",
                      "[9]": "nilgiri_langur"}

monkey_breeds_dict_n = {"n0": "mantled_howler",
                        "n1": "patas_monkey",
                        "n2": "bald_uakari",
                        "n3": "japanese_macaque",
                        "n4": "pygmy_marmoset",
                        "n5": "white_headed_capuchin",
                        "n6": "silvery_marmoset",
                        "n7": "common_squirrel_monkey",
                        "n8": "black_headed_night_monkey",
                        "n9": "nilgiri_langur"}

def draw_test(name, pred, im):
    monkey = monkey_breeds_dict[str(pred)]
    BLACK = [0, 0, 0]
    expanded_image = cv2.copyMakeBorder(im, 80, 0, 0, 100, cv2.BORDER_CONSTANT, value=BLACK)
    cv2.putText(expanded_image, monkey, (20, 60), cv2.FONT_HERSHEY_SIMPLEX, 1, (0, 0, 255), 2)
    cv2.imshow(name, expanded_image)

def getRandomImage(path):
    """loads a random image from a random folder in our test path"""
    folders = list(filter(lambda x: os.path.isdir(os.path.join(path, x)), os.listdir(path)))
    random_directory = np.random.randint(0, len(folders))
    path_class = folders[random_directory]
    print("Class - " + monkey_breeds_dict_n[str(path_class)])
    file_path = path + path_class
    file_names = [f for f in listdir(file_path) if isfile(join(file_path, f))]
    random_file_index = np.random.randint(0, len(file_names))
    image_name = file_names[random_file_index]
    return cv2.imread(file_path + "/" + image_name)

for i in range(0, 10):
    input_im = getRandomImage("./monkey_breed/validation/")
    input_original = input_im.copy()
    input_original = cv2.resize(input_original, None, fx=0.5, fy=0.5, interpolation = cv2.INTER_LINEAR)

    input_im = cv2.resize(input_im, (224, 224), interpolation = cv2.INTER_LINEAR)
    input_im = input_im / 255.
    input_im = input_im.reshape(1, 224, 224, 3)

    # Get Prediction
    res = np.argmax(classifier.predict(input_im, 1, verbose = 0), axis=1)

    # Show image with predicted class
    draw_test("Prediction", res, input_original)
    cv2.waitKey(0)

cv2.destroyAllWindows()
```

Output:

```
Class - mantled_howler
Class - patas_monkey
Class - patas_monkey
Class - silvery_marmoset
Class - black_headed_night_monkey
Class - pygmy_marmoset
Class - silvery_marmoset
Class - mantled_howler
Class - common_squirrel_monkey
Class - patas_monkey
```
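The two hardcoded dictionaries assume the generator assigned the folders n0 to n9 to indices 0 to 9 in that order. That happens to hold (flow_from_directory sorts folder names), but the mapping can be derived from the generator itself so it can never drift out of sync with training (a sketch, reusing the notebook's existing names):

```python
# Invert class_indices ({folder_name: index}) to get index -> folder name
idx_to_folder = {v: k for k, v in train_generator.class_indices.items()}

res = int(np.argmax(classifier.predict(input_im), axis=1)[0])
print("Predicted class:", monkey_breeds_dict_n[idx_to_folder[res]])
```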
(End of notebook: Python 3 kernel, Python 3.7.4, nbformat 4.2.)
Added (the notebook is now stored as a Git LFS pointer):

+ version https://git-lfs.github.com/spec/v1
+ oid sha256:7cfba96836be5c73270dc3440eb23b49d7abac26a1c67b8f79fc7d86417032b7
+ size 32387
15. Transfer Learning & Fine Tuning/15.3 Making a Flower Classifier with VGG16.ipynb
CHANGED
@@ -1,693 +1,3 @@
# Making a Flower Classifier with VGG16

### Loading the VGG16 Model

```python
from tensorflow.keras.applications import VGG16

# VGG16 was designed to work on 224 x 224 pixel input images
img_rows = 224
img_cols = 224

# Loads the VGG16 model
vgg16 = VGG16(weights = 'imagenet',
              include_top = False,
              input_shape = (img_rows, img_cols, 3))
```
### Inspecting each layer

```python
# Let's print our layers
for (i, layer) in enumerate(vgg16.layers):
    print(str(i) + " " + layer.__class__.__name__, layer.trainable)
```

Output:

```
0 InputLayer True
1 Conv2D True
2 Conv2D True
3 MaxPooling2D True
4 Conv2D True
5 Conv2D True
6 MaxPooling2D True
7 Conv2D True
8 Conv2D True
9 Conv2D True
10 MaxPooling2D True
11 Conv2D True
12 Conv2D True
13 Conv2D True
14 MaxPooling2D True
15 Conv2D True
16 Conv2D True
17 Conv2D True
18 MaxPooling2D True
```
### Let's freeze all the VGG16 layers

```python
from tensorflow.keras.applications import VGG16

# VGG16 was designed to work on 224 x 224 pixel input images
img_rows = 224
img_cols = 224

# Re-loads the VGG16 model without the top or FC layers
vgg16 = VGG16(weights = 'imagenet',
              include_top = False,
              input_shape = (img_rows, img_cols, 3))

# Here we freeze every layer in the base
# (layers are trainable by default)
for layer in vgg16.layers:
    layer.trainable = False

# Let's print our layers
for (i, layer) in enumerate(vgg16.layers):
    print(str(i) + " " + layer.__class__.__name__, layer.trainable)
```

Output:

```
0 InputLayer False
1 Conv2D False
... (layers 2-17, all False) ...
18 MaxPooling2D False
```
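The original section title spoke of leaving the top 4 layers trainable, but the loop above freezes everything, which is what the printed output confirms. If partial fine-tuning is actually what you want, the usual recipe is to keep the last convolutional block trainable while fixing the early, generic features. A sketch (the last 4 layers here are the three block5 convolutions plus block5_pool):

```python
# Freeze everything except the top 4 layers of the base
for layer in vgg16.layers[:-4]:
    layer.trainable = False
for layer in vgg16.layers[-4:]:
    layer.trainable = True
```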
### Let's make a function that returns our FC Head

```python
def addTopModel(bottom_model, num_classes, D=256):
    """creates the top or head of the model that will be
    placed on top of the bottom layers"""
    top_model = bottom_model.output
    top_model = Flatten(name = "flatten")(top_model)
    top_model = Dense(D, activation = "relu")(top_model)
    top_model = Dropout(0.3)(top_model)
    top_model = Dense(num_classes, activation = "softmax")(top_model)
    return top_model
```
### Let's add our FC Head back onto VGG

```python
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense, Dropout, Activation, Flatten
from tensorflow.keras.layers import Conv2D, MaxPooling2D, ZeroPadding2D
from tensorflow.keras.layers import BatchNormalization
from tensorflow.keras.models import Model

num_classes = 17

FC_Head = addTopModel(vgg16, num_classes)

model = Model(inputs=vgg16.input, outputs=FC_Head)

print(model.summary())
```

Output (the five frozen VGG16 conv blocks are elided):

```
Model: "model"
_________________________________________________________________
Layer (type)                 Output Shape              Param #
=================================================================
input_3 (InputLayer)         [(None, 224, 224, 3)]     0
block1_conv1 (Conv2D)        (None, 224, 224, 64)      1792
...
block5_pool (MaxPooling2D)   (None, 7, 7, 512)         0
flatten (Flatten)            (None, 25088)             0
dense (Dense)                (None, 256)               6422784
dropout (Dropout)            (None, 256)               0
dense_1 (Dense)              (None, 17)                4369
=================================================================
Total params: 21,141,841
Trainable params: 6,427,153
Non-trainable params: 14,714,688
_________________________________________________________________
None
```

Almost all of the 6,427,153 trainable parameters come from the first Dense layer: flattening 7 x 7 x 512 gives 25,088 features, so it needs 25,088 x 256 weights + 256 biases = 6,422,784 parameters.
|
| 248 |
-
{
|
| 249 |
-
"cell_type": "markdown",
|
| 250 |
-
"metadata": {},
|
| 251 |
-
"source": [
|
| 252 |
-
"### Loading our Flowers Dataset"
|
| 253 |
-
]
|
| 254 |
-
},
|
| 255 |
-
{
|
| 256 |
-
"cell_type": "code",
|
| 257 |
-
"execution_count": 9,
|
| 258 |
-
"metadata": {},
|
| 259 |
-
"outputs": [
|
| 260 |
-
{
|
| 261 |
-
"name": "stdout",
|
| 262 |
-
"output_type": "stream",
|
| 263 |
-
"text": [
|
| 264 |
-
"Found 1190 images belonging to 17 classes.\n",
|
| 265 |
-
"Found 170 images belonging to 17 classes.\n"
|
| 266 |
-
]
|
| 267 |
-
}
|
| 268 |
-
],
|
| 269 |
-
"source": [
|
| 270 |
-
"from tensorflow.keras.preprocessing.image import ImageDataGenerator\n",
|
| 271 |
-
"\n",
|
| 272 |
-
"train_data_dir = './17_flowers/train'\n",
|
| 273 |
-
"validation_data_dir = './17_flowers/validation'\n",
|
| 274 |
-
"\n",
|
| 275 |
-
"train_datagen = ImageDataGenerator(\n",
|
| 276 |
-
" rescale=1./255,\n",
|
| 277 |
-
" rotation_range=20,\n",
|
| 278 |
-
" width_shift_range=0.2,\n",
|
| 279 |
-
" height_shift_range=0.2,\n",
|
| 280 |
-
" horizontal_flip=True,\n",
|
| 281 |
-
" fill_mode='nearest')\n",
|
| 282 |
-
" \n",
|
| 283 |
-
"validation_datagen = ImageDataGenerator(rescale=1./255)\n",
|
| 284 |
-
" \n",
|
| 285 |
-
"# Change the batchsize according to your system RAM\n",
|
| 286 |
-
"train_batchsize = 16\n",
|
| 287 |
-
"val_batchsize = 10\n",
|
| 288 |
-
" \n",
|
| 289 |
-
"train_generator = train_datagen.flow_from_directory(\n",
|
| 290 |
-
" train_data_dir,\n",
|
| 291 |
-
" target_size=(img_rows, img_cols),\n",
|
| 292 |
-
" batch_size=train_batchsize,\n",
|
| 293 |
-
" class_mode='categorical')\n",
|
| 294 |
-
" \n",
|
| 295 |
-
"validation_generator = validation_datagen.flow_from_directory(\n",
|
| 296 |
-
" validation_data_dir,\n",
|
| 297 |
-
" target_size=(img_rows, img_cols),\n",
|
| 298 |
-
" batch_size=val_batchsize,\n",
|
| 299 |
-
" class_mode='categorical',\n",
|
| 300 |
-
" shuffle=False)"
|
| 301 |
-
]
|
| 302 |
-
},
|
| 303 |
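One thing worth knowing: VGG16's ImageNet weights were trained with Caffe-style preprocessing (BGR channel order, per-channel mean subtraction), not a 1/255 rescale. The simple rescale still works acceptably for transfer learning, but matching the original preprocessing via `preprocess_input` may buy some accuracy. A sketch of the swap (you would then rebuild the two `flow_from_directory` generators from these):

```python
from tensorflow.keras.applications.vgg16 import preprocess_input
from tensorflow.keras.preprocessing.image import ImageDataGenerator

# preprocessing_function runs on each image after augmentation
train_datagen = ImageDataGenerator(
      preprocessing_function=preprocess_input,
      rotation_range=20,
      width_shift_range=0.2,
      height_shift_range=0.2,
      horizontal_flip=True,
      fill_mode='nearest')

validation_datagen = ImageDataGenerator(preprocessing_function=preprocess_input)
```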
-
{
|
| 304 |
-
"cell_type": "markdown",
|
| 305 |
-
"metadata": {},
|
| 306 |
-
"source": [
|
| 307 |
-
"### Training our top layers"
|
| 308 |
-
]
|
| 309 |
-
},
|
| 310 |
-
{
|
| 311 |
-
"cell_type": "code",
|
| 312 |
-
"execution_count": 11,
|
| 313 |
-
"metadata": {},
|
| 314 |
-
"outputs": [
|
| 315 |
-
{
|
| 316 |
-
"name": "stdout",
|
| 317 |
-
"output_type": "stream",
|
| 318 |
-
"text": [
|
| 319 |
-
"WARNING:tensorflow:sample_weight modes were coerced from\n",
|
| 320 |
-
" ...\n",
|
| 321 |
-
" to \n",
|
| 322 |
-
" ['...']\n",
|
| 323 |
-
"WARNING:tensorflow:sample_weight modes were coerced from\n",
|
| 324 |
-
" ...\n",
|
| 325 |
-
" to \n",
|
| 326 |
-
" ['...']\n",
|
| 327 |
-
"Train for 74 steps, validate for 10 steps\n",
|
| 328 |
-
"73/74 [============================>.] - ETA: 4s - loss: 3.7274 - accuracy: 0.2366\n",
|
| 329 |
-
"Epoch 00001: val_loss improved from inf to 1.32238, saving model to flowers_vgg.h5\n",
|
| 330 |
-
"74/74 [==============================] - 340s 5s/step - loss: 3.7036 - accuracy: 0.2394 - val_loss: 1.3224 - val_accuracy: 0.5900\n"
|
| 331 |
-
]
|
| 332 |
-
}
|
| 333 |
-
],
|
| 334 |
-
"source": [
|
| 335 |
-
"from tensorflow.keras.optimizers import RMSprop\n",
|
| 336 |
-
"from tensorflow.keras.callbacks import ModelCheckpoint, EarlyStopping\n",
|
| 337 |
-
" \n",
|
| 338 |
-
"checkpoint = ModelCheckpoint(\"flowers_vgg.h5\",\n",
|
| 339 |
-
" monitor=\"val_loss\",\n",
|
| 340 |
-
" mode=\"min\",\n",
|
| 341 |
-
" save_best_only = True,\n",
|
| 342 |
-
" verbose=1)\n",
|
| 343 |
-
"\n",
|
| 344 |
-
"earlystop = EarlyStopping(monitor = 'val_loss', \n",
|
| 345 |
-
" min_delta = 0, \n",
|
| 346 |
-
" patience = 3,\n",
|
| 347 |
-
" verbose = 1,\n",
|
| 348 |
-
" restore_best_weights = True)\n",
|
| 349 |
-
"\n",
|
| 350 |
-
"# we put our call backs into a callback list\n",
|
| 351 |
-
"callbacks = [earlystop, checkpoint]\n",
|
| 352 |
-
"\n",
|
| 353 |
-
"# Note we use a very small learning rate \n",
|
| 354 |
-
"model.compile(loss = 'categorical_crossentropy',\n",
|
| 355 |
-
" optimizer = RMSprop(lr = 0.001),\n",
|
| 356 |
-
" metrics = ['accuracy'])\n",
|
| 357 |
-
"\n",
|
| 358 |
-
"nb_train_samples = 1190\n",
|
| 359 |
-
"nb_validation_samples = 170\n",
|
| 360 |
-
"epochs = 5\n",
|
| 361 |
-
"batch_size = 16\n",
|
| 362 |
-
"\n",
|
| 363 |
-
"history = model.fit_generator(\n",
|
| 364 |
-
" train_generator,\n",
|
| 365 |
-
" steps_per_epoch = nb_train_samples // batch_size,\n",
|
| 366 |
-
" epochs = epochs,\n",
|
| 367 |
-
" callbacks = callbacks,\n",
|
| 368 |
-
" validation_data = validation_generator,\n",
|
| 369 |
-
" validation_steps = nb_validation_samples // batch_size)\n",
|
| 370 |
-
"\n",
|
| 371 |
-
"model.save(\"flowers_vgg.h5\")"
|
| 372 |
-
]
|
| 373 |
-
},
|
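Because the validation generator was built with `shuffle=False`, its prediction order lines up with `validation_generator.classes`, which makes a per-class report straightforward. A sketch using scikit-learn (on TF >= 2.1 you can pass the generator to `model.predict` instead of `predict_generator`):

```python
import numpy as np
from sklearn.metrics import classification_report

# Restart the generator so predictions start from the first file
validation_generator.reset()
preds = model.predict_generator(validation_generator,
                                steps=len(validation_generator))
y_pred = np.argmax(preds, axis=1)

print(classification_report(
    validation_generator.classes, y_pred,
    target_names=list(validation_generator.class_indices.keys())))
```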
## Can we speed this up?
#### Let's try re-sizing the image to 64 x 64

```python
from tensorflow.keras.applications import VGG16

# Setting the input size to 64 x 64 pixels
img_rows = 64
img_cols = 64

# Re-loads the VGG16 model without the top or FC layers
vgg16 = VGG16(weights = 'imagenet',
              include_top = False,
              input_shape = (img_rows, img_cols, 3))

# Freeze every layer in the base
# (layers are trainable by default)
for layer in vgg16.layers:
    layer.trainable = False

# Let's print our layers
for (i, layer) in enumerate(vgg16.layers):
    print(str(i) + " " + layer.__class__.__name__, layer.trainable)
```

Output:

```
0 InputLayer False
1 Conv2D False
... (layers 2-17, all False) ...
18 MaxPooling2D False
```
### Let's create our new model using an image size of 64 x 64

```python
from tensorflow.keras.applications import VGG16
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense, Dropout, Activation, Flatten
from tensorflow.keras.layers import Conv2D, MaxPooling2D, ZeroPadding2D
from tensorflow.keras.layers import BatchNormalization
from tensorflow.keras.models import Model
from tensorflow.keras.optimizers import RMSprop
from tensorflow.keras.preprocessing.image import ImageDataGenerator

train_data_dir = './17_flowers/train'
validation_data_dir = './17_flowers/validation'

train_datagen = ImageDataGenerator(
      rescale=1./255,
      rotation_range=20,
      width_shift_range=0.2,
      height_shift_range=0.2,
      horizontal_flip=True,
      fill_mode='nearest')

validation_datagen = ImageDataGenerator(rescale=1./255)

# Change the batch size according to your system RAM
train_batchsize = 16
val_batchsize = 10

train_generator = train_datagen.flow_from_directory(
        train_data_dir,
        target_size=(img_rows, img_cols),
        batch_size=train_batchsize,
        class_mode='categorical')

validation_generator = validation_datagen.flow_from_directory(
        validation_data_dir,
        target_size=(img_rows, img_cols),
        batch_size=val_batchsize,
        class_mode='categorical',
        shuffle=False)

# Re-loads the VGG16 model without the top or FC layers
vgg16 = VGG16(weights = 'imagenet',
              include_top = False,
              input_shape = (img_rows, img_cols, 3))

# Freeze layers
for layer in vgg16.layers:
    layer.trainable = False

# Number of classes in the Flowers-17 dataset
num_classes = 17

FC_Head = addTopModel(vgg16, num_classes)

model = Model(inputs=vgg16.input, outputs=FC_Head)

print(model.summary())
```

Output (the frozen conv blocks are elided):

```
Found 1190 images belonging to 17 classes.
Found 170 images belonging to 17 classes.
Model: "model_1"
_________________________________________________________________
Layer (type)                 Output Shape              Param #
=================================================================
input_5 (InputLayer)         [(None, 64, 64, 3)]       0
block1_conv1 (Conv2D)        (None, 64, 64, 64)        1792
...
block5_pool (MaxPooling2D)   (None, 2, 2, 512)         0
flatten (Flatten)            (None, 2048)              0
dense_2 (Dense)              (None, 256)               524544
dropout_1 (Dropout)          (None, 256)               0
dense_3 (Dense)              (None, 17)                4369
=================================================================
Total params: 15,243,601
Trainable params: 528,913
Non-trainable params: 14,714,688
_________________________________________________________________
None
```

Shrinking the input shrinks the head: the flatten output is now 2 x 2 x 512 = 2,048 features, so the first Dense layer needs only 2,048 x 256 + 256 = 524,544 parameters, roughly 12x fewer trainable weights than at 224 x 224.
| 570 |
-
{
|
| 571 |
-
"cell_type": "markdown",
|
| 572 |
-
"metadata": {},
|
| 573 |
-
"source": [
|
| 574 |
-
"### Training using 64 x 64 image size is MUCH faster!"
|
| 575 |
-
]
|
| 576 |
-
},
|
| 577 |
-
{
|
| 578 |
-
"cell_type": "code",
|
| 579 |
-
"execution_count": 17,
|
| 580 |
-
"metadata": {},
|
| 581 |
-
"outputs": [
|
| 582 |
-
{
|
| 583 |
-
"name": "stdout",
|
| 584 |
-
"output_type": "stream",
|
| 585 |
-
"text": [
|
| 586 |
-
"WARNING:tensorflow:sample_weight modes were coerced from\n",
|
| 587 |
-
" ...\n",
|
| 588 |
-
" to \n",
|
| 589 |
-
" ['...']\n",
|
| 590 |
-
"WARNING:tensorflow:sample_weight modes were coerced from\n",
|
| 591 |
-
" ...\n",
|
| 592 |
-
" to \n",
|
| 593 |
-
" ['...']\n",
|
| 594 |
-
"Train for 37 steps, validate for 5 steps\n",
|
| 595 |
-
"Epoch 1/5\n",
|
| 596 |
-
"36/37 [============================>.] - ETA: 0s - loss: 2.7539 - accuracy: 0.1343\n",
|
| 597 |
-
"Epoch 00001: val_loss improved from inf to 2.57583, saving model to flowers_vgg_64.h5\n",
|
| 598 |
-
"37/37 [==============================] - 18s 486ms/step - loss: 2.7530 - accuracy: 0.1375 - val_loss: 2.5758 - val_accuracy: 0.0800\n",
|
| 599 |
-
"Epoch 2/5\n",
|
| 600 |
-
"36/37 [============================>.] - ETA: 0s - loss: 2.5438 - accuracy: 0.2101\n",
|
| 601 |
-
"Epoch 00002: val_loss improved from 2.57583 to 2.45450, saving model to flowers_vgg_64.h5\n",
|
| 602 |
-
"37/37 [==============================] - 20s 537ms/step - loss: 2.5496 - accuracy: 0.2111 - val_loss: 2.4545 - val_accuracy: 0.2600\n",
|
| 603 |
-
"Epoch 3/5\n",
|
| 604 |
-
"36/37 [============================>.] - ETA: 0s - loss: 2.3475 - accuracy: 0.2934\n",
|
| 605 |
-
"Epoch 00003: val_loss improved from 2.45450 to 2.16252, saving model to flowers_vgg_64.h5\n",
|
| 606 |
-
"37/37 [==============================] - 18s 494ms/step - loss: 2.3422 - accuracy: 0.2939 - val_loss: 2.1625 - val_accuracy: 0.4400\n",
|
| 607 |
-
"Epoch 4/5\n",
|
| 608 |
-
"36/37 [============================>.] - ETA: 0s - loss: 2.2175 - accuracy: 0.3316\n",
|
| 609 |
-
"Epoch 00004: val_loss improved from 2.16252 to 2.07525, saving model to flowers_vgg_64.h5\n",
|
| 610 |
-
"37/37 [==============================] - 18s 473ms/step - loss: 2.2229 - accuracy: 0.3260 - val_loss: 2.0753 - val_accuracy: 0.4800\n",
|
| 611 |
-
"Epoch 5/5\n",
|
| 612 |
-
"36/37 [============================>.] - ETA: 0s - loss: 2.0746 - accuracy: 0.3681\n",
|
| 613 |
-
"Epoch 00005: val_loss improved from 2.07525 to 2.02786, saving model to flowers_vgg_64.h5\n",
|
| 614 |
-
"37/37 [==============================] - 16s 440ms/step - loss: 2.0630 - accuracy: 0.3694 - val_loss: 2.0279 - val_accuracy: 0.4000\n"
|
| 615 |
-
]
|
| 616 |
-
}
|
| 617 |
-
],
|
| 618 |
-
"source": [
|
| 619 |
-
"from tensorflow.keras.optimizers import RMSprop\n",
|
| 620 |
-
"from tensorflow.keras.callbacks import ModelCheckpoint, EarlyStopping, ReduceLROnPlateau\n",
|
| 621 |
-
" \n",
|
| 622 |
-
"checkpoint = ModelCheckpoint(\"flowers_vgg_64.h5\",\n",
|
| 623 |
-
" monitor=\"val_loss\",\n",
|
| 624 |
-
" mode=\"min\",\n",
|
| 625 |
-
" save_best_only = True,\n",
|
| 626 |
-
" verbose=1)\n",
|
| 627 |
-
"\n",
|
| 628 |
-
"earlystop = EarlyStopping(monitor = 'val_loss', \n",
|
| 629 |
-
" min_delta = 0, \n",
|
| 630 |
-
" patience = 5,\n",
|
| 631 |
-
" verbose = 1,\n",
|
| 632 |
-
" restore_best_weights = True)\n",
|
| 633 |
-
"\n",
|
| 634 |
-
"reduce_lr = ReduceLROnPlateau(monitor = 'val_loss',\n",
|
| 635 |
-
" factor = 0.2,\n",
|
| 636 |
-
" patience = 3,\n",
|
| 637 |
-
" verbose = 1,\n",
|
| 638 |
-
" min_delta = 0.00001)\n",
|
| 639 |
-
"\n",
|
| 640 |
-
"# we put our call backs into a callback list\n",
|
| 641 |
-
"callbacks = [earlystop, checkpoint, reduce_lr]\n",
|
| 642 |
-
"\n",
|
| 643 |
-
"# Note we use a very small learning rate \n",
|
| 644 |
-
"model.compile(loss = 'categorical_crossentropy',\n",
|
| 645 |
-
" optimizer = RMSprop(lr = 0.0001),\n",
|
| 646 |
-
" metrics = ['accuracy'])\n",
|
| 647 |
-
"\n",
|
| 648 |
-
"nb_train_samples = 1190\n",
|
| 649 |
-
"nb_validation_samples = 170\n",
|
| 650 |
-
"epochs = 5\n",
|
| 651 |
-
"batch_size = 32\n",
|
| 652 |
-
"\n",
|
| 653 |
-
"history = model.fit_generator(\n",
|
| 654 |
-
" train_generator,\n",
|
| 655 |
-
" steps_per_epoch = nb_train_samples // batch_size,\n",
|
| 656 |
-
" epochs = epochs,\n",
|
| 657 |
-
" callbacks = callbacks,\n",
|
| 658 |
-
" validation_data = validation_generator,\n",
|
| 659 |
-
" validation_steps = nb_validation_samples // batch_size)\n",
|
| 660 |
-
"\n",
|
| 661 |
-
"model.save(\"flowers_vgg_64.h5\")"
|
| 662 |
-
]
|
| 663 |
-
},
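Note that ModelCheckpoint has already been writing the best-val_loss weights to flowers_vgg_64.h5 during training, so the trailing model.save() overwrites that file with the last-epoch weights. In this particular run the two coincide (epoch 5 has the best val_loss, 2.02786), but in general it is safer to save the final state under a separate name, e.g.:

# Keep the checkpointed best-val_loss weights intact; the filename is illustrative.
model.save("flowers_vgg_64_final.h5")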
|
| 664 |
-
{
|
| 665 |
-
"cell_type": "code",
|
| 666 |
-
"execution_count": null,
|
| 667 |
-
"metadata": {},
|
| 668 |
-
"outputs": [],
|
| 669 |
-
"source": []
|
| 670 |
-
}
|
| 671 |
-
],
|
| 672 |
-
"metadata": {
|
| 673 |
-
"kernelspec": {
|
| 674 |
-
"display_name": "Python 3",
|
| 675 |
-
"language": "python",
|
| 676 |
-
"name": "python3"
|
| 677 |
-
},
|
| 678 |
-
"language_info": {
|
| 679 |
-
"codemirror_mode": {
|
| 680 |
-
"name": "ipython",
|
| 681 |
-
"version": 3
|
| 682 |
-
},
|
| 683 |
-
"file_extension": ".py",
|
| 684 |
-
"mimetype": "text/x-python",
|
| 685 |
-
"name": "python",
|
| 686 |
-
"nbconvert_exporter": "python",
|
| 687 |
-
"pygments_lexer": "ipython3",
|
| 688 |
-
"version": "3.7.4"
|
| 689 |
-
}
|
| 690 |
-
},
|
| 691 |
-
"nbformat": 4,
|
| 692 |
-
"nbformat_minor": 2
|
| 693 |
-
}
|
|
|
|
| 1 |
+
version https://git-lfs.github.com/spec/v1
|
| 2 |
+
oid sha256:8acdd1268c480f226cf962047183add29323d917b30897ee768d03f76f26c05d
|
| 3 |
+
size 26375
|
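A minimal inference sketch for the saved flower classifier, assuming img_rows = img_cols = 64 (as the section title and filename suggest), an illustrative image path, and the same 1./255 rescaling the training generators applied:

import numpy as np
from tensorflow.keras.models import load_model
from tensorflow.keras.preprocessing import image

model = load_model("flowers_vgg_64.h5")

# "flower.jpg" is a placeholder path; the class ordering comes from
# train_generator.class_indices in the training notebook.
img = image.load_img("flower.jpg", target_size=(64, 64))
x = np.expand_dims(image.img_to_array(img) / 255.0, axis=0)

probs = model.predict(x)[0]
print("predicted class index:", np.argmax(probs))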
16. Design Your Own CNN - LittleVGG/16.2 LittleVGG - Simpsons.ipynb
CHANGED
|
The diff for this file is too large to render.
See raw diff
|
|
|
18 . Deep Survaliance - Build a Face Detector with Emotion, Age and Gender Recognition/18.2 Building an Emotion Detector with LittleVGG.ipynb
CHANGED
|
@@ -1,723 +1,3 @@
|
|
| 1 |
-
|
| 2 |
-
|
| 3 |
-
|
| 4 |
-
"cell_type": "markdown",
|
| 5 |
-
"metadata": {},
|
| 6 |
-
"source": [
|
| 7 |
-
"# Using LittleVGG for Emotion Detection"
|
| 8 |
-
]
|
| 9 |
-
},
|
| 10 |
-
{
|
| 11 |
-
"cell_type": "markdown",
|
| 12 |
-
"metadata": {},
|
| 13 |
-
"source": [
|
| 14 |
-
"### Training Emotion Detector"
|
| 15 |
-
]
|
| 16 |
-
},
|
| 17 |
-
{
|
| 18 |
-
"cell_type": "code",
|
| 19 |
-
"execution_count": 8,
|
| 20 |
-
"metadata": {},
|
| 21 |
-
"outputs": [
|
| 22 |
-
{
|
| 23 |
-
"name": "stdout",
|
| 24 |
-
"output_type": "stream",
|
| 25 |
-
"text": [
|
| 26 |
-
"Found 28709 images belonging to 7 classes.\n",
|
| 27 |
-
"Found 3589 images belonging to 7 classes.\n"
|
| 28 |
-
]
|
| 29 |
-
}
|
| 30 |
-
],
|
| 31 |
-
"source": [
|
| 32 |
-
"from __future__ import print_function\n",
|
| 33 |
-
"import tensorflow as tf\n",
|
| 34 |
-
"from tensorflow.keras.preprocessing.image import ImageDataGenerator\n",
|
| 35 |
-
"from tensorflow.keras.models import Sequential\n",
|
| 36 |
-
"from tensorflow.keras.layers import Dense, Dropout, Activation, Flatten, BatchNormalization\n",
|
| 37 |
-
"from tensorflow.keras.layers import Conv2D, MaxPooling2D\n",
|
| 38 |
-
"from tensorflow.keras.preprocessing.image import ImageDataGenerator\n",
|
| 39 |
-
"import os\n",
|
| 40 |
-
"\n",
|
| 41 |
-
"num_classes = 7\n",
|
| 42 |
-
"img_rows, img_cols = 48, 48\n",
|
| 43 |
-
"batch_size = 16\n",
|
| 44 |
-
"\n",
|
| 45 |
-
"train_data_dir = './fer2013/train'\n",
|
| 46 |
-
"validation_data_dir = './fer2013/validation'\n",
|
| 47 |
-
"\n",
|
| 48 |
-
"# Let's use some data augmentaiton \n",
|
| 49 |
-
"train_datagen = ImageDataGenerator(\n",
|
| 50 |
-
" rescale=1./255,\n",
|
| 51 |
-
" rotation_range=30,\n",
|
| 52 |
-
" shear_range=0.3,\n",
|
| 53 |
-
" zoom_range=0.3,\n",
|
| 54 |
-
" width_shift_range=0.4,\n",
|
| 55 |
-
" height_shift_range=0.4,\n",
|
| 56 |
-
" horizontal_flip=True,\n",
|
| 57 |
-
" fill_mode='nearest')\n",
|
| 58 |
-
" \n",
|
| 59 |
-
"validation_datagen = ImageDataGenerator(rescale=1./255)\n",
|
| 60 |
-
" \n",
|
| 61 |
-
"train_generator = train_datagen.flow_from_directory(\n",
|
| 62 |
-
" train_data_dir,\n",
|
| 63 |
-
" color_mode = 'grayscale',\n",
|
| 64 |
-
" target_size=(img_rows, img_cols),\n",
|
| 65 |
-
" batch_size=batch_size,\n",
|
| 66 |
-
" class_mode='categorical',\n",
|
| 67 |
-
" shuffle=True)\n",
|
| 68 |
-
" \n",
|
| 69 |
-
"validation_generator = validation_datagen.flow_from_directory(\n",
|
| 70 |
-
" validation_data_dir,\n",
|
| 71 |
-
" color_mode = 'grayscale',\n",
|
| 72 |
-
" target_size=(img_rows, img_cols),\n",
|
| 73 |
-
" batch_size=batch_size,\n",
|
| 74 |
-
" class_mode='categorical',\n",
|
| 75 |
-
" shuffle=True)"
|
| 76 |
-
]
|
| 77 |
-
},
|
| 78 |
-
{
|
| 79 |
-
"cell_type": "markdown",
|
| 80 |
-
"metadata": {},
|
| 81 |
-
"source": [
|
| 82 |
-
"## Our Keras Imports"
|
| 83 |
-
]
|
| 84 |
-
},
|
| 85 |
-
{
|
| 86 |
-
"cell_type": "code",
|
| 87 |
-
"execution_count": 9,
|
| 88 |
-
"metadata": {},
|
| 89 |
-
"outputs": [],
|
| 90 |
-
"source": [
|
| 91 |
-
"from tensorflow.keras.models import Sequential\n",
|
| 92 |
-
"from tensorflow.keras.layers import BatchNormalization\n",
|
| 93 |
-
"from tensorflow.keras.layers import Conv2D, MaxPooling2D\n",
|
| 94 |
-
"from tensorflow.keras.layers import ELU\n",
|
| 95 |
-
"from tensorflow.keras.layers import Activation, Flatten, Dropout, Dense"
|
| 96 |
-
]
|
| 97 |
-
},
|
| 98 |
-
{
|
| 99 |
-
"cell_type": "markdown",
|
| 100 |
-
"metadata": {},
|
| 101 |
-
"source": [
|
| 102 |
-
"## Keras LittleVGG Model"
|
| 103 |
-
]
|
| 104 |
-
},
|
| 105 |
-
{
|
| 106 |
-
"cell_type": "code",
|
| 107 |
-
"execution_count": 10,
|
| 108 |
-
"metadata": {},
|
| 109 |
-
"outputs": [
|
| 110 |
-
{
|
| 111 |
-
"name": "stdout",
|
| 112 |
-
"output_type": "stream",
|
| 113 |
-
"text": [
|
| 114 |
-
"Model: \"sequential_1\"\n",
|
| 115 |
-
"_________________________________________________________________\n",
|
| 116 |
-
"Layer (type) Output Shape Param # \n",
|
| 117 |
-
"=================================================================\n",
|
| 118 |
-
"conv2d_8 (Conv2D) (None, 48, 48, 32) 320 \n",
|
| 119 |
-
"_________________________________________________________________\n",
|
| 120 |
-
"activation_11 (Activation) (None, 48, 48, 32) 0 \n",
|
| 121 |
-
"_________________________________________________________________\n",
|
| 122 |
-
"batch_normalization_10 (Batc (None, 48, 48, 32) 128 \n",
|
| 123 |
-
"_________________________________________________________________\n",
|
| 124 |
-
"conv2d_9 (Conv2D) (None, 48, 48, 32) 9248 \n",
|
| 125 |
-
"_________________________________________________________________\n",
|
| 126 |
-
"activation_12 (Activation) (None, 48, 48, 32) 0 \n",
|
| 127 |
-
"_________________________________________________________________\n",
|
| 128 |
-
"batch_normalization_11 (Batc (None, 48, 48, 32) 128 \n",
|
| 129 |
-
"_________________________________________________________________\n",
|
| 130 |
-
"max_pooling2d_4 (MaxPooling2 (None, 24, 24, 32) 0 \n",
|
| 131 |
-
"_________________________________________________________________\n",
|
| 132 |
-
"dropout_6 (Dropout) (None, 24, 24, 32) 0 \n",
|
| 133 |
-
"_________________________________________________________________\n",
|
| 134 |
-
"conv2d_10 (Conv2D) (None, 24, 24, 64) 18496 \n",
|
| 135 |
-
"_________________________________________________________________\n",
|
| 136 |
-
"activation_13 (Activation) (None, 24, 24, 64) 0 \n",
|
| 137 |
-
"_________________________________________________________________\n",
|
| 138 |
-
"batch_normalization_12 (Batc (None, 24, 24, 64) 256 \n",
|
| 139 |
-
"_________________________________________________________________\n",
|
| 140 |
-
"conv2d_11 (Conv2D) (None, 24, 24, 64) 36928 \n",
|
| 141 |
-
"_________________________________________________________________\n",
|
| 142 |
-
"activation_14 (Activation) (None, 24, 24, 64) 0 \n",
|
| 143 |
-
"_________________________________________________________________\n",
|
| 144 |
-
"batch_normalization_13 (Batc (None, 24, 24, 64) 256 \n",
|
| 145 |
-
"_________________________________________________________________\n",
|
| 146 |
-
"max_pooling2d_5 (MaxPooling2 (None, 12, 12, 64) 0 \n",
|
| 147 |
-
"_________________________________________________________________\n",
|
| 148 |
-
"dropout_7 (Dropout) (None, 12, 12, 64) 0 \n",
|
| 149 |
-
"_________________________________________________________________\n",
|
| 150 |
-
"conv2d_12 (Conv2D) (None, 12, 12, 128) 73856 \n",
|
| 151 |
-
"_________________________________________________________________\n",
|
| 152 |
-
"activation_15 (Activation) (None, 12, 12, 128) 0 \n",
|
| 153 |
-
"_________________________________________________________________\n",
|
| 154 |
-
"batch_normalization_14 (Batc (None, 12, 12, 128) 512 \n",
|
| 155 |
-
"_________________________________________________________________\n",
|
| 156 |
-
"conv2d_13 (Conv2D) (None, 12, 12, 128) 147584 \n",
|
| 157 |
-
"_________________________________________________________________\n",
|
| 158 |
-
"activation_16 (Activation) (None, 12, 12, 128) 0 \n",
|
| 159 |
-
"_________________________________________________________________\n",
|
| 160 |
-
"batch_normalization_15 (Batc (None, 12, 12, 128) 512 \n",
|
| 161 |
-
"_________________________________________________________________\n",
|
| 162 |
-
"max_pooling2d_6 (MaxPooling2 (None, 6, 6, 128) 0 \n",
|
| 163 |
-
"_________________________________________________________________\n",
|
| 164 |
-
"dropout_8 (Dropout) (None, 6, 6, 128) 0 \n",
|
| 165 |
-
"_________________________________________________________________\n",
|
| 166 |
-
"conv2d_14 (Conv2D) (None, 6, 6, 256) 295168 \n",
|
| 167 |
-
"_________________________________________________________________\n",
|
| 168 |
-
"activation_17 (Activation) (None, 6, 6, 256) 0 \n",
|
| 169 |
-
"_________________________________________________________________\n",
|
| 170 |
-
"batch_normalization_16 (Batc (None, 6, 6, 256) 1024 \n",
|
| 171 |
-
"_________________________________________________________________\n",
|
| 172 |
-
"conv2d_15 (Conv2D) (None, 6, 6, 256) 590080 \n",
|
| 173 |
-
"_________________________________________________________________\n",
|
| 174 |
-
"activation_18 (Activation) (None, 6, 6, 256) 0 \n",
|
| 175 |
-
"_________________________________________________________________\n",
|
| 176 |
-
"batch_normalization_17 (Batc (None, 6, 6, 256) 1024 \n",
|
| 177 |
-
"_________________________________________________________________\n",
|
| 178 |
-
"max_pooling2d_7 (MaxPooling2 (None, 3, 3, 256) 0 \n",
|
| 179 |
-
"_________________________________________________________________\n",
|
| 180 |
-
"dropout_9 (Dropout) (None, 3, 3, 256) 0 \n",
|
| 181 |
-
"_________________________________________________________________\n",
|
| 182 |
-
"flatten_1 (Flatten) (None, 2304) 0 \n",
|
| 183 |
-
"_________________________________________________________________\n",
|
| 184 |
-
"dense_3 (Dense) (None, 64) 147520 \n",
|
| 185 |
-
"_________________________________________________________________\n",
|
| 186 |
-
"activation_19 (Activation) (None, 64) 0 \n",
|
| 187 |
-
"_________________________________________________________________\n",
|
| 188 |
-
"batch_normalization_18 (Batc (None, 64) 256 \n",
|
| 189 |
-
"_________________________________________________________________\n",
|
| 190 |
-
"dropout_10 (Dropout) (None, 64) 0 \n",
|
| 191 |
-
"_________________________________________________________________\n",
|
| 192 |
-
"dense_4 (Dense) (None, 64) 4160 \n",
|
| 193 |
-
"_________________________________________________________________\n",
|
| 194 |
-
"activation_20 (Activation) (None, 64) 0 \n",
|
| 195 |
-
"_________________________________________________________________\n",
|
| 196 |
-
"batch_normalization_19 (Batc (None, 64) 256 \n",
|
| 197 |
-
"_________________________________________________________________\n",
|
| 198 |
-
"dropout_11 (Dropout) (None, 64) 0 \n",
|
| 199 |
-
"_________________________________________________________________\n",
|
| 200 |
-
"dense_5 (Dense) (None, 7) 455 \n",
|
| 201 |
-
"_________________________________________________________________\n",
|
| 202 |
-
"activation_21 (Activation) (None, 7) 0 \n",
|
| 203 |
-
"=================================================================\n",
|
| 204 |
-
"Total params: 1,328,167\n",
|
| 205 |
-
"Trainable params: 1,325,991\n",
|
| 206 |
-
"Non-trainable params: 2,176\n",
|
| 207 |
-
"_________________________________________________________________\n",
|
| 208 |
-
"None\n"
|
| 209 |
-
]
|
| 210 |
-
}
|
| 211 |
-
],
|
| 212 |
-
"source": [
|
| 213 |
-
"model = Sequential()\n",
|
| 214 |
-
"\n",
|
| 215 |
-
"model.add(Conv2D(32, (3, 3), padding = 'same', kernel_initializer=\"he_normal\",\n",
|
| 216 |
-
" input_shape = (img_rows, img_cols, 1)))\n",
|
| 217 |
-
"model.add(Activation('elu'))\n",
|
| 218 |
-
"model.add(BatchNormalization())\n",
|
| 219 |
-
"model.add(Conv2D(32, (3, 3), padding = \"same\", kernel_initializer=\"he_normal\", \n",
|
| 220 |
-
" input_shape = (img_rows, img_cols, 1)))\n",
|
| 221 |
-
"model.add(Activation('elu'))\n",
|
| 222 |
-
"model.add(BatchNormalization())\n",
|
| 223 |
-
"model.add(MaxPooling2D(pool_size=(2, 2)))\n",
|
| 224 |
-
"model.add(Dropout(0.2))\n",
|
| 225 |
-
"\n",
|
| 226 |
-
"# Block #2: second CONV => RELU => CONV => RELU => POOL\n",
|
| 227 |
-
"# layer set\n",
|
| 228 |
-
"model.add(Conv2D(64, (3, 3), padding=\"same\", kernel_initializer=\"he_normal\"))\n",
|
| 229 |
-
"model.add(Activation('elu'))\n",
|
| 230 |
-
"model.add(BatchNormalization())\n",
|
| 231 |
-
"model.add(Conv2D(64, (3, 3), padding=\"same\", kernel_initializer=\"he_normal\"))\n",
|
| 232 |
-
"model.add(Activation('elu'))\n",
|
| 233 |
-
"model.add(BatchNormalization())\n",
|
| 234 |
-
"model.add(MaxPooling2D(pool_size=(2, 2)))\n",
|
| 235 |
-
"model.add(Dropout(0.2))\n",
|
| 236 |
-
"\n",
|
| 237 |
-
"# Block #3: third CONV => RELU => CONV => RELU => POOL\n",
|
| 238 |
-
"# layer set\n",
|
| 239 |
-
"model.add(Conv2D(128, (3, 3), padding=\"same\", kernel_initializer=\"he_normal\"))\n",
|
| 240 |
-
"model.add(Activation('elu'))\n",
|
| 241 |
-
"model.add(BatchNormalization())\n",
|
| 242 |
-
"model.add(Conv2D(128, (3, 3), padding=\"same\", kernel_initializer=\"he_normal\"))\n",
|
| 243 |
-
"model.add(Activation('elu'))\n",
|
| 244 |
-
"model.add(BatchNormalization())\n",
|
| 245 |
-
"model.add(MaxPooling2D(pool_size=(2, 2)))\n",
|
| 246 |
-
"model.add(Dropout(0.2))\n",
|
| 247 |
-
"\n",
|
| 248 |
-
"# Block #4: third CONV => RELU => CONV => RELU => POOL\n",
|
| 249 |
-
"# layer set\n",
|
| 250 |
-
"model.add(Conv2D(256, (3, 3), padding=\"same\", kernel_initializer=\"he_normal\"))\n",
|
| 251 |
-
"model.add(Activation('elu'))\n",
|
| 252 |
-
"model.add(BatchNormalization())\n",
|
| 253 |
-
"model.add(Conv2D(256, (3, 3), padding=\"same\", kernel_initializer=\"he_normal\"))\n",
|
| 254 |
-
"model.add(Activation('elu'))\n",
|
| 255 |
-
"model.add(BatchNormalization())\n",
|
| 256 |
-
"model.add(MaxPooling2D(pool_size=(2, 2)))\n",
|
| 257 |
-
"model.add(Dropout(0.2))\n",
|
| 258 |
-
"\n",
|
| 259 |
-
"# Block #5: first set of FC => RELU layers\n",
|
| 260 |
-
"model.add(Flatten())\n",
|
| 261 |
-
"model.add(Dense(64, kernel_initializer=\"he_normal\"))\n",
|
| 262 |
-
"model.add(Activation('elu'))\n",
|
| 263 |
-
"model.add(BatchNormalization())\n",
|
| 264 |
-
"model.add(Dropout(0.5))\n",
|
| 265 |
-
"\n",
|
| 266 |
-
"# Block #6: second set of FC => RELU layers\n",
|
| 267 |
-
"model.add(Dense(64, kernel_initializer=\"he_normal\"))\n",
|
| 268 |
-
"model.add(Activation('elu'))\n",
|
| 269 |
-
"model.add(BatchNormalization())\n",
|
| 270 |
-
"model.add(Dropout(0.5))\n",
|
| 271 |
-
"\n",
|
| 272 |
-
"# Block #7: softmax classifier\n",
|
| 273 |
-
"model.add(Dense(num_classes, kernel_initializer=\"he_normal\"))\n",
|
| 274 |
-
"model.add(Activation(\"softmax\"))\n",
|
| 275 |
-
"\n",
|
| 276 |
-
"print(model.summary())"
|
| 277 |
-
]
|
| 278 |
-
},
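The parameter counts in the summary can be checked by hand: a Conv2D layer has (kh * kw * in_channels + 1) * filters parameters and a Dense layer has (in_features + 1) * units. For example:

# conv2d_8: 3x3 kernel, 1 grayscale input channel, 32 filters
print((3 * 3 * 1 + 1) * 32)    # 320
# conv2d_9: 3x3 kernel, 32 input channels, 32 filters
print((3 * 3 * 32 + 1) * 32)   # 9248
# dense_3: 3*3*256 = 2304 flattened features into 64 units
print((2304 + 1) * 64)         # 147520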
|
| 279 |
-
{
|
| 280 |
-
"cell_type": "markdown",
|
| 281 |
-
"metadata": {},
|
| 282 |
-
"source": [
|
| 283 |
-
"## Training our model"
|
| 284 |
-
]
|
| 285 |
-
},
|
| 286 |
-
{
|
| 287 |
-
"cell_type": "code",
|
| 288 |
-
"execution_count": 12,
|
| 289 |
-
"metadata": {},
|
| 290 |
-
"outputs": [
|
| 291 |
-
{
|
| 292 |
-
"name": "stdout",
|
| 293 |
-
"output_type": "stream",
|
| 294 |
-
"text": [
|
| 295 |
-
"WARNING:tensorflow:sample_weight modes were coerced from\n",
|
| 296 |
-
" ...\n",
|
| 297 |
-
" to \n",
|
| 298 |
-
" ['...']\n",
|
| 299 |
-
"Train for 1795 steps\n",
|
| 300 |
-
"1795/1795 [==============================] - 607s 338ms/step - loss: 2.0255 - accuracy: 0.2012\n"
|
| 301 |
-
]
|
| 302 |
-
}
|
| 303 |
-
],
|
| 304 |
-
"source": [
|
| 305 |
-
"from tensorflow.keras.optimizers import RMSprop, SGD, Adam\n",
|
| 306 |
-
"from tensorflow.keras.callbacks import ModelCheckpoint, EarlyStopping, ReduceLROnPlateau\n",
|
| 307 |
-
"\n",
|
| 308 |
-
" \n",
|
| 309 |
-
"checkpoint = ModelCheckpoint(\"emotion_little_vgg.h5\",\n",
|
| 310 |
-
" monitor=\"val_loss\",\n",
|
| 311 |
-
" mode=\"min\",\n",
|
| 312 |
-
" save_best_only = True,\n",
|
| 313 |
-
" verbose=1)\n",
|
| 314 |
-
"\n",
|
| 315 |
-
"earlystop = EarlyStopping(monitor = 'val_loss', \n",
|
| 316 |
-
" min_delta = 0, \n",
|
| 317 |
-
" patience = 3,\n",
|
| 318 |
-
" verbose = 1,\n",
|
| 319 |
-
" restore_best_weights = True)\n",
|
| 320 |
-
"\n",
|
| 321 |
-
"reduce_lr = ReduceLROnPlateau(monitor = 'val_loss', factor = 0.2, patience = 3, verbose = 1, min_delta = 0.0001)\n",
|
| 322 |
-
"\n",
|
| 323 |
-
"# we put our call backs into a callback list\n",
|
| 324 |
-
"callbacks = [earlystop, checkpoint] #reduce_lr]\n",
|
| 325 |
-
"\n",
|
| 326 |
-
"# We use a very small learning rate \n",
|
| 327 |
-
"model.compile(loss = 'categorical_crossentropy',\n",
|
| 328 |
-
" optimizer = Adam(lr=0.001),\n",
|
| 329 |
-
" metrics = ['accuracy'])\n",
|
| 330 |
-
"\n",
|
| 331 |
-
"nb_train_samples = 28273\n",
|
| 332 |
-
"nb_validation_samples = 3534\n",
|
| 333 |
-
"epochs = 5\n",
|
| 334 |
-
"\n",
|
| 335 |
-
"history = model.fit(\n",
|
| 336 |
-
" train_generator,\n",
|
| 337 |
-
" epochs = epochs)\n",
|
| 338 |
-
" callbacks = callbacks)"
|
| 339 |
-
]
|
| 340 |
-
},
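One gap worth flagging: every callback above monitors val_loss, but the fit call passes no validation_data, so ModelCheckpoint and EarlyStopping have nothing to act on (the log indeed shows training metrics only). A sketch of the presumably intended call, using the generators defined at the top of this notebook:

history = model.fit(
    train_generator,
    epochs = epochs,
    callbacks = callbacks,
    validation_data = validation_generator,
    validation_steps = nb_validation_samples // batch_size)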
|
| 341 |
-
{
|
| 342 |
-
"cell_type": "code",
|
| 343 |
-
"execution_count": 15,
|
| 344 |
-
"metadata": {},
|
| 345 |
-
"outputs": [
|
| 346 |
-
{
|
| 347 |
-
"name": "stdout",
|
| 348 |
-
"output_type": "stream",
|
| 349 |
-
"text": [
|
| 350 |
-
"Found 3589 images belonging to 7 classes.\n",
|
| 351 |
-
"Confusion Matrix\n",
|
| 352 |
-
"[[ 0 0 0 439 0 52 0]\n",
|
| 353 |
-
" [ 0 0 0 52 0 3 0]\n",
|
| 354 |
-
" [ 0 0 0 486 0 42 0]\n",
|
| 355 |
-
" [ 0 0 0 790 0 89 0]\n",
|
| 356 |
-
" [ 0 0 0 565 0 61 0]\n",
|
| 357 |
-
" [ 0 0 0 496 0 98 0]\n",
|
| 358 |
-
" [ 0 0 0 401 0 15 0]]\n",
|
| 359 |
-
"Classification Report\n",
|
| 360 |
-
" precision recall f1-score support\n",
|
| 361 |
-
"\n",
|
| 362 |
-
" Angry 0.00 0.00 0.00 491\n",
|
| 363 |
-
" Disgust 0.00 0.00 0.00 55\n",
|
| 364 |
-
" Fear 0.00 0.00 0.00 528\n",
|
| 365 |
-
" Happy 0.24 0.90 0.38 879\n",
|
| 366 |
-
" Neutral 0.00 0.00 0.00 626\n",
|
| 367 |
-
" Sad 0.27 0.16 0.21 594\n",
|
| 368 |
-
" Surprise 0.00 0.00 0.00 416\n",
|
| 369 |
-
"\n",
|
| 370 |
-
" accuracy 0.25 3589\n",
|
| 371 |
-
" macro avg 0.07 0.15 0.08 3589\n",
|
| 372 |
-
"weighted avg 0.10 0.25 0.13 3589\n",
|
| 373 |
-
"\n"
|
| 374 |
-
]
|
| 375 |
-
},
|
| 376 |
-
{
|
| 377 |
-
"name": "stderr",
|
| 378 |
-
"output_type": "stream",
|
| 379 |
-
"text": [
|
| 380 |
-
"C:\\ProgramData\\Anaconda3\\envs\\cv\\lib\\site-packages\\sklearn\\metrics\\classification.py:1437: UndefinedMetricWarning: Precision and F-score are ill-defined and being set to 0.0 in labels with no predicted samples.\n",
|
| 381 |
-
" 'precision', 'predicted', average, warn_for)\n"
|
| 382 |
-
]
|
| 383 |
-
},
|
| 384 |
-
{
|
| 385 |
-
"data": {
|
| 386 |
-
"image/png": "iVBORw0KGgoAAAANSUhEUgAAAekAAAHHCAYAAACbaKDRAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADh0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uMy4xLjEsIGh0dHA6Ly9tYXRwbG90bGliLm9yZy8QZhcZAAAgAElEQVR4nO3debglVXnv8e8PZFAREBCCgOKAUxwQUCESgxoTp4hjnHJFQtLG2Rhzg8NVork3Js6axKQjKniNI0FwuAoiOJAANtiCiCIiSgcCNggOINrd7/2j6tDb9gzd5+yza1fx/TxPPadqVe3aa/dw3v2uWkOqCkmSNH226roCkiRpdgZpSZKmlEFakqQpZZCWJGlKGaQlSZpSBmlJkqbUrbqugCRJS/X7D79tXXPt+rHf99zzb/pcVT167DfeTAZpSVLvXXPtes753J3Gft+t9/zObmO/6RYwSEuSeq+ADWzouhpj5zNpSZKmlJm0JGkAivVlJi1JkibETFqS1HvNM+nhLRhlkJYkDYIdxyRJ0sSYSUuSeq8o1tfwmrvNpCVJmlJm0pKkQbDjmCRJU6iA9QMM0jZ3S5I0pcykJUmDMMTmbjNpSZKmlJm0JKn3CgY5BMsgLUkahOHNN2ZztyRJU8tMWpLUe0U5BEuSJE2OmbQkqf8K1g8vkTaTliRpWplJS5J6rxhm726DtCRpAMJ60nUlxs7mbkmSppSZtCSp9wrYYMcxSZI0KWbSkqRBGOIzaYO0JKn3imEGaZu7JUmaUmbSkqRB2FBm0pIkaULMpCVJvTfUZ9IGaUlS7xVh/QAbh4f3iSRJGggzaUnSINhxTJIkTYyZtCSp9+w4JknS1Arra3iNw8P7RJIkDYSZtCSp9wrYMMC8c3ifSJKkgTCTliQNwhA7jplJS5I0pcykJUm9V2XvbkmSptYGMvZtIUnumWT1yPbjJC9LskuSU5N8p/15+/b6JHlnkkuSnJ/kgPnub5CWJGmRqurbVbV/Ve0PHAjcAJwIHA2cVlX7Aae1xwCPAfZrtxXAu+e7v0FaktR7zYxjW41920KPBL5bVd8HDgeOa8uPA57Y7h8OHF+Ns4Cdk+w51w0N0pIkzW23JKtGthXzXPsM4EPt/h5VdSVA+3P3tnwv4PKR16xpy2ZlxzFJ0gAsW8extVV10ILvnmwLPAF45UKXzlJWc11skJYk9d4UzDj2GOC8qrqqPb4qyZ5VdWXbnH11W74G2GfkdXsDV8x1U5u7JUlaumeysakb4GTgiHb/COCkkfLntL28Dwaun2kWn42ZtCRpENZXNzOOJbkN8CjgeSPFbwQ+muQo4AfA09ryzwCPBS6h6Ql+5Hz3NkhLkrQEVXUDsOsmZdfQ9Pbe9NoCXri59zZIS5J6r8hihkxNPYO0JGkQNjgtqCRJmhQzaUlS783MODY0w/tEkiQNhJm0JKn3inQ2BGs5LUsmneRJSSrJvZbj/pIk3RIsV3P3M4Gv0Ew2vmRJzPglSfPawFZj37o29uCXZAfgocDDaaY/OybJYcAxwFrgvsC5wB9VVSV5LPDW9tx5wF2r6vFJjgHuCOwLrE2yD/Diqlrdvs+ZwPOr6vxxfwZJUr9UsVwLbHRqOTLUJwKfraqLk1yb5IC2/IHAb9JMJH4m8NAkq4B/AR5WVd9L8qFN7nUgcGhV3ZjkCOC5wMuS3APYbq4A3S4ltgJga7Y+8DbsOOaPKG10051v03UVxma779/QdRXGKrcaTiNcrVvXdRXG4uf8jF/UTcN7eLxMluNf8DOBt7f7H26PPw2cU1VrAJKspsmQfwpcWlXfa6//EG1wbZ1cVTe2+x8D/leSvwT+GHj/XBWoqpXASoAds0s9JL82M5s0Nhe/dsFV7HrjHn+yqusqjNXWu+2+8EU9sf6qqxe+qAfOrtOW6c5hw6yrQPbbWIN0kl2BRwD3TVLA1jTD1z4D3DRy6fr2vRf6E/3ZzE5V3ZDkVOBw4A+B4fxmlCRpFuPOpJ8KHF9VN68EkuSLwKFzXP8t4K5J9q2qy4CnL3D/9wCfBL5cVdeOob6SpAEofCa9OZ5JszzXqBOA5wPf3fTi9lnzC4DPJlkLnDPfzavq3CQ/Bt43pvpKkgZiiDOOjTVIV9Vhs5S9E3jnJmUvGjk8varulSTAPwKr2muO2fReSe5IM2zslPHVWpKk6TQNXzv+tO1IdiGwE01v71+T5DnA2cCrq2rDBOsnSZpyRdhQ49+61vn4hKp6G/C2zbjueOD45a+RJEnTofMgLUnSOPhMWpKkKVTAhgH27h7eJ5IkaSDMpCVJAxDWD3DGMTNpSZKmlJm0JKn3fCYtSZImykxakjQIQ3wmbZCWJPVeVWzuliRJk2MmLUkahCEuVTm8TyRJ0kCYSUuSeq+ADXYckyRpGsXmbkmSNDlm0pKk3mtmHLO5W9Im7v3qH3RdhbFZ33UFxmz9VVd3XQVpSQzSkqRBWD/AJ7gGaUlS7xUZZHP38L52SJI0EGbSkqRB2DDAvHN4n0iSpIEwk5Yk9V4VrPeZtCRJmhQzaUnSIAyxd7dBWpLUe80QrOE1Dg/vE0mSNBBm0pKkQVg/wKUqzaQlSZpSBmlJUu/NrII17m1zJNk5yceTfCvJRUkOSbJLklOTfKf9efv22iR5Z5JLkpyf5ID57m2QliQNQNNxbNzbZnoH8NmquhfwAOAi4GjgtKraDzitPQZ4DLBfu60A3j3fjQ3SkiQtUpIdgYcBxwJU1S+q6jrgcOC49rLjgCe2+4cDx1fjLGDnJHvOdX+DtCRpEDaQsW+b4a7AD4H3JflakvckuS2wR1VdCdD+3L29fi/g8pHXr2nLZmWQliRpbrslWTWyrdjk/K2AA4B3V9UDgZ+xsWl7NrNF/prrYodgSZJ6bxnn7l5bVQfNc34NsKaqzm6PP04TpK9KsmdVXdk2Z189cv0+I6/fG7hirpubSUuSBqGLjmNV9d/A5Unu2RY9EvgmcDJwRFt2BHBSu38y8Jy2l/fBwPUzzeKzMZOWJGlpXgx8MMm2wKXAkTRJ8EeTHAX8AHhae+1ngMcClwA3tNfOySAtSeq9Zu7ubmYcq6rVwGxN4o+c5doCXri597a5W5KkKWUmLUkahM0cMtUrW5xJJ1mfZHWSC5N8PcnLk2zVnjsoyTvHX81fq8O+SZ613O8jSVKXFpNJ31hV+wMk2R34N2An4HVVtQpYNcb6zWVf4Fnte0uSbuFm5u4emiU9k66qq2nmHn1R2538sCSfAkjyO23GvbqdheV2SbZK8k9tFv6pJJ9J8tT2+suS7NbuH5TkjLnuA7wR+O227M+X8hkkScPQ4dzdy2bJz6Sr6tK2uXv3TU69AnhhVZ2ZZAfg58CTabLg+7XXXwS8d4G3mO0+RwOvqKrHz/aCdkaYFQDbc5tFfS5Jkro2rq8Js7UxnAm8NclLgJ2rah1wKPCxqtrQDgA/fTPuPdt95lVVK6vqoKo6aBu224KPIUnqpWVYpnIams+XHKST3BVYz8YpzwCoqjcCfwLcGjgryb2YP
ZjPWDdSn+0XuI8kSYO3pCCd5A7APwP/0A7QHj13t6q6oKr+jqYz2b2ArwBPaZ9N7wEcNvKSy4AD2/2nLHCfnwC3W0rdJUnDUXS2CtayWswz6VsnWQ1sQ5P9fgB46yzXvSzJw2my7G8C/w/4Jc0MLN8ALgbOBq5vr/9r4Ngkr2rL57vPBmBdkq8D76+qty3ic0iSBmQamqfHbYuDdFVtPc+5M4Az2v0Xz3ZNkldU1U+T7AqcA1zQXv9l4B6z3HPW+zDLdGuSJA1JFzOOfSrJzsC2wBvaDmSSJC3aUMdJTzxIV9Vhk35PSZL6yLm7JUmDYCYtSdIU6nKpyuXU/ZxnkiRpVmbSkqRBmIZxzeNmJi1J0pQyk5Yk9V8Ns+OYmbQkSVPKTFqS1HtOZiJJ0hQbYpC2uVuSpCllJi1J6j0nM5EkSRNlJi1JGoQaYCZtkJYkDYIzjkmSpIkxk5Yk9V4545gkSZokM2lpiS591x5dV2Fs7vyHV3ddhbHaetdduq7C2Ky/5tquqzD17DgmSdJUcpy0JEmaIDNpSdIgDLG520xakqQpZSYtSeq9oS5VaSYtSdKUMpOWJPVfNROaDI1BWpI0CM7dLUmSJsZMWpLUe4VDsCRJ0gSZSUuSBmCY04IapCVJgzDE3t02d0uSNKXMpCVJg2DHMUmS9CuSXJbkgiSrk6xqy3ZJcmqS77Q/b9+WJ8k7k1yS5PwkB8x3b4O0JKn3qppMetzbFnh4Ve1fVQe1x0cDp1XVfsBp7THAY4D92m0F8O75bmqQliQNwobK2LclOBw4rt0/DnjiSPnx1TgL2DnJnnPdxCAtSdLSFHBKknOTrGjL9qiqKwHan7u35XsBl4+8dk1bNis7jkmSBmGZhmDtNvOcubWyqlZucs1Dq+qKJLsDpyb51jz3my09n7PmBmlJkua2duQ586yq6or259VJTgQeDFyVZM+qurJtzr66vXwNsM/Iy/cGrpjr3jZ3S5IGoYuOY0lum+R2M/vA7wHfAE4GjmgvOwI4qd0/GXhO28v7YOD6mWbx2Uw0k06yHrhgpOiJVXXZJOsgSRqeYot7Y4/LHsCJSaCJqf9WVZ9N8lXgo0mOAn4APK29/jPAY4FLgBuAI+e7+aSbu2+sqv3HdbM0fyqpqg3juqckSZurqi4FHjBL+TXAI2cpL+CFm3v/zpu7k2yd5E1JvtoO7H5eW75DktOSnNcOEj+8Ld83yUVJ/gk4j19t25ck3ULVMmxdm3Qmfeskq9v971XVk4CjaNrkH5RkO+DMJKfQdFF/UlX9OMluwFlJTm5fe0/gyKp6wWxv0naBXwGwPbdZzs8jSdKymYbm7t8D7p/kqe3xTjQzsawB/k+ShwEbaMaR7dFe8/12EPis2u7xKwF2zC7T8GVIkrScaphzd0/DEKwAL66qz/1KYfJc4A7AgVX1yySXAdu3p3820RpKktSBzp9JA58Dnp9kG4Ak92i7se8EXN0G6IcDd+6ykpKkKTfAh9LTkEm/B9gXOK/trf1DmjlOPwh8sp3pZTUw3wwukqRbOJu7l6iqdpilbAPwqnbb1CFz3Oq+46yXJEnTaBoyaUmSlmyZ5u7u1DQ8k5YkSbMwk5Yk9V7hM2lJkqZTAQMM0jZ3S5I0pcykJUmDYMcxSZI0MWbSkqRhGGAmbZCWJA1ABtm72+ZuSZKmlJm0JGkYBtjcbSYtSdKUMpOWJPVfDXPGMTNpSZKmlJm0JGkYBvhM2iAtSRoIm7slSdKEmElLkoZhgM3dZtKSJE0pM2lpib516Ae6rsLY/D77d12F8dr19l3XYHyuubbrGky/AWbSBmlJUv8V4DhpSZI0KWbSkqRBqAE2d5tJS5I0pcykJUnDMMBM2iAtSRoGO45JkqRJMZOWJA1CBtjcbSYtSdKUMpOWJPVfMciOY2bSkiRNKTNpSdIAZJC9uw3SkqRhsLlbkiRNipm0JGkYzKQlSdKkmElLkoZhgJm0QVqS1H/FIHt329wtSdKUMkhLkgYhNf5ts9432TrJ15J8qj2+S5Kzk3wnyUeSbNuWb9ceX9Ke33ehexukJUlampcCF40c/x3wtqraD/gRcFRbfhTwo6q6O/C29rp5GaQlScNQy7AtIMnewOOA97THAR4BfLy95Djgie3+4e0x7flHttfPaWxBOslPNzl+bpJ/GNf9JUmaQm8H/iewoT3eFbiuqta1x2uAvdr9vYDLAdrz17fXz8lMWpKkue2WZNXItmLmRJLHA1dX1bkj18+WGddmnJvVRIZgJfkD4DXAtsA1wLOr6qokxwB3o/l2sQ/w91X1r0kOA17fXntP4EvAC4AjgftW1Z+39/1T4N5V9fJJfA5J0vTa3I5eW2htVR00x7mHAk9I8lhge2BHmsx65yS3arPlvYEr2uvX0MS6NUluBewEXDvfm48zk751ktUzG02QnfEV4OCqeiDwYZqmgRn3p2nPPwR4bZI7tuUPBv4CuB9NIH9y+9onJNmmveZI4H2bViTJiplvPb/kpvF9QkmSWlX1yqrau6r2BZ4BfKGqng2cDjy1vewI4KR2/+T2mPb8F6pqYpn0jVW1/8xBkucCM98+9gY+kmRPmmz6eyOvO6mqbgRuTHI6TXC+Djinqi5t7/Uh4NCq+niSLwCPT3IRsE1VXbBpRapqJbASYMfsMsA5aCRJv2Z6JjP5K+DDSf4G+BpwbFt+LPCBJJfQZNDPWOhGk5px7F3AW6vq5LYp+5iRc5sG0Vqg/D3Aq4BvMUsWLUnSpFXVGcAZ7f6lNAnnptf8HHjaltx3Uh3HdgL+q90/YpNzhyfZPsmuwGHAV9vyB7cDwrcCnk7TZE5VnU3Tpv8s4EPLXXFJUg8sx/CrKWiHnVSQPgb4WJIvA2s3OXcO8GngLOANVTXzgP0/gTcC36BpHj9x5DUfBc6sqh8tZ6UlST0ywCA9tubuqtphk+P3A+9v909i44PzTV1cVStmKb+hqp4+x2sOpZmtRZKkwerVOOkkOye5mKaT2mld10eSND26mrt7OXW6VGVVHTNH+Rm0D+A3Kb8OuMeyVkqSpCnhetKSpGGYgsx33AzSkqRhGGCQ7tUzaUmSbknMpCVJvTctHb3GzUxakqQpZSYtSRqG6Zm7e2wM0pKkYbC5W5IkTYqZtCRpEOw4JkmSJsZMWpI0DGbSkiRpUsykJUn9N9DJTAzSkqRhGGCQtrlbkqQpZSYtSRoGM2lJkjQpZtLSEj3hO4/uugpj9N9dV2Cs8rMbu66CJmiIHcfMpCVJmlIGaUmSppTN3ZKkYbC5W5IkTYqZtCSp/5xxTJKkKTbAIG1ztyRJU8pMWpI0DGbSkiRpUsykJUm9F4bZccxMWpKkKWUmLUkahgFm0gZpSVL/DXSctM3dkiRNKTNpSdIwmElLkqRJMZOWJA3DADNpg7QkaRDsOCZJkibGTFqSNAxm0pIkaVLMpCVJ/VeYSc9IUkneMnL8iiTHLPJeOyd5wSJfe1mS3RbzWknSsKTGv3Vtsc3dNwFPHlOA3BmYNUgn2XoM95ckaeySbJ/knCRfT3Jhkr9uy++S5Owk30nykSTb
tuXbtceXtOf3Xeg9Fhuk1wErgT+fpdJ3SHJCkq+220Pb8mOSvGLkum+0FXwjcLckq5O8KclhSU5P8m/ABe21n0hybvuHsGKRdZYkDVktwza/m4BHVNUDgP2BRyc5GPg74G1VtR/wI+Co9vqjgB9V1d2Bt7XXzWspHcf+EXh2kp02KX9HW7kHAU8B3rPAfY4GvltV+1fVX7ZlDwZeXVX3aY//uKoOBA4CXpJk1/lumGRFklVJVv2Sm7bkM0mStFmq8dP2cJt2K+ARwMfb8uOAJ7b7h7fHtOcfmSTzvceiO45V1Y+THA+8BLhx5NTvAvcZed8dk9xuC29/TlV9b+T4JUme1O7vA+wHXDNP3VbSZPrsmF2m4KmCJGm5dfEMuX0sey5wd5rk9bvAdVW1rr1kDbBXu78XcDlAVa1Lcj2wK7B2rvsvtXf324HzgPeNlG0FHFJVo4GbJOv41cx9+3nu+7OR1x1GE/gPqaobkpyxwGslSRqX3ZKsGjle2SaCAFTVemD/JDsDJwL3nuUeM18fZsua5/1qsaRx0lV1LfBRNra3A5wCvGjmIMn+7e5lwAFt2QHAXdrynwDzZdo70bTh35DkXsDBS6mzJGmglueZ9NqqOmhkW8ksquo64AyaGLVzkpkkeG/ginZ/DU1rMO35nYBr5/tI45jM5C3AaC/vlwAHJTk/yTeBP2vLTwB2SbIaeD5wMUBVXQOc2XYke9Ms9/8scKsk5wNvAM4aQ50lSUOyHAF6gebztqP0zu3+rWlafS8CTgee2l52BHBSu39ye0x7/gtVNe+7LKq5u6p2GNm/CrjNyPFa4OmzvOZG4PfmuN+zNik6Y+TcTcBj5njdvltQbUmSxmlP4Lj2ufRWwEer6lNtgvrhJH8DfA04tr3+WOADSS6hyaCfsdAbOOOYJKn3wuwPfJdTVZ0PPHCW8ktpRiltWv5z4Glb8h7O3S1J0pQyk5YkDcMAB9wapCVJgzANc22Pm83dkiRNKTNpSdIwmElLkqRJMZOWJA3DADNpg7Qkqf/KjmOSJGmCzKQlScNgJi1JkibFTFqSNAg+k5YkSRNjJi1JGoYBZtIGaUnSINjcLUmSJsZMWpLUf8Ugm7vNpCVJmlJm0tISfXftrl1XYWz25r+7rsJY1U47dF2F8fmvrivQAwPMpA3SkqTeC3YckyRJE2QmLUkaBjNpSZI0KWbSkqRBSA0vlTZIS5L6z3HSkiRpksykJUmD4BAsSZI0MWbSkqRhGGAmbZCWJA2Czd2SJGlizKQlScNgJi1JkibFTFqS1H/lM2lJkjRBZtKSpGEYYCZtkJYk9V6wuVuSJE2QmbQkaRgGuFSlmbQkSVPKTFqSNAg+kx6zJK9OcmGS85OsTvKQzXzdvkm+sdz1kyT1RC3T1rHOMukkhwCPBw6oqpuS7AZs21V9JEmaNl02d+8JrK2qmwCqai1AktcCfwDcGvgP4HlVVUkOBN4L3AB8pZsqS5KmVTZ0XYPx67K5+xRgnyQXJ/mnJL/Tlv9DVT2oqu5LE6gf35a/D3hJVR2y0I2TrEiyKsmqX3LT8tRekqRl1lmQrqqfAgcCK4AfAh9J8lzg4UnOTnIB8AjgN5PsBOxcVV9sX/6BBe69sqoOqqqDtmG75fsQkqTp4TPp8aqq9cAZwBltUH4ecH/goKq6PMkxwPY0k8lMwR+XJGladdG7O8k+wPHAbwAbgJVV9Y4kuwAfAfYFLgP+sKp+lCTAO4DH0jy+fW5VnTfX/TvLpJPcM8l+I0X7A99u99cm2QF4KkBVXQdcn+TQ9vyzJ1dTSZLmtA74i6q6N3Aw8MIk9wGOBk6rqv2A09pjgMcA+7XbCuDd8928y0x6B+BdSXam+ZCX0FT4OuACmm8eXx25/kjgvUluAD432apKkqZa0cmMY1V1JXBlu/+TJBcBewGHA4e1lx1H02r8V2358VVVwFlJdk6yZ3ufX9NZkK6qc4HfmuXUa9pttusfMFJ0zPLUTJKkLZdkX+CBwNnAHjOBt6quTLJ7e9lewOUjL1vTlk1XkJYkaZyW6Zn0bklWjRyvrKqVv/bezSPaE4CXVdWPm0fPs5rtxJw1N0hLkjS3tVV10HwXJNmGJkB/sKr+vS2+aqYZO8mewNVt+Rpgn5GX7w1cMde9XWBDkjQMHQzBantrHwtcVFVvHTl1MnBEu38EcNJI+XPSOBi4fq7n0WAmLUkagNDZAhsPBf4HcEGS1W3Zq4A3Ah9NchTwA+Bp7bnP0Ay/uoRmCNaR893cIC1J0iJV1VeY/TkzwCNnub6AF27u/Q3SkqT+q+pkCNZy85m0JElTykxakjQIHT2TXlYGaUnSMAwwSNvcLUnSlDKTliQNwhCbu82kJUmaUmbSkqT+K2DD8FJpg7QkaRiGF6Nt7pYkaVqZSUuSBsGOY5IkaWLMpCVJw+Dc3ZIkaVLMpKUl+vnlt+u6CprDhu9c1nUVNEFDfCZtkJYk9V/hECxJkjQ5ZtKSpN4LEDuOSZKkSTGTliQNw4auKzB+BmlJ0iDY3C1JkibGTFqS1H8OwZIkSZNkJi1JGoAa5NzdBmlJ0iAMcVpQm7slSZpSZtKSpGEYYHO3mbQkSVPKTFqS1H8FGeCMY2bSkiRNKTNpSdIwDPCZtEFakjQMw4vRNndLkjStzKQlSYPgKliSJGlizKQlScNwS82kk7w6yYVJzk+yOslDlqMyST6TZOfluLckacAK2LAMW8cWzKSTHAI8Hjigqm5Kshuw7ebcPMmtqmrdZlwXIFX12M25ryRJtwSbk0nvCaytqpsAqmptVV2R5LI2YJPkoCRntPvHJFmZ5BTg+CTPTXJSks8m+XaS17XX7ZvkoiT/BJwH7DNzzyS3TfLpJF9P8o0kT29fc2CSLyY5N8nnkuw5/j8SSVLfhCI1/q1rm/NM+hTgtUkuBj4PfKSqvrjAaw4EDq2qG5M8F3gwcF/gBuCrST4NrAXuCRxZVS8AaBJqAB4NXFFVj2vLd0qyDfAu4PCq+mEbuP838MebvnmSFcCK9vCnn6+Pf3szPudS7EbzeYbAz7KlXvrxZX8LJvRZLlvuN2hM7t/YL5b9Hfz/suXuPIH3GIwFg3RV/TTJgcBvAw8HPpLk6AVednJV3ThyfGpVXQOQ5N+BQ4FPAN+vqrNmef0FwJuT/B3wqar6cpL70gT6U9tgvjVw5Rx1XgmsXOizjUuSVVV10KTebzn5WaaTn2U6+VmmzBRkvuO2Wb27q2o9cAZwRpILgCOAdWxsLt9+k5f8bNNbzHG86XUz73dx+8XgscDftk3nJwIXVtUhm1NnSdItzACD9ILPpJPcM8l+I0X7A9+naRk7sC17ygK3eVSSXZLcGngicOYC73lH4Iaq+r/Am4EDgG8Dd2g7spFkmyS/uVD9JUnqq83JpHcA3tUOjVoHXELzvPfewLFJXgWcvcA9vgJ8ALg78G9VtSrJvvNcfz/gTUk2AL8Enl9Vv0jyVOCdSXZq6/524MLN+AzLbWJN6xPgZ5lOfpbp5GeZFjNDsAYmtczNA23HsYOq6kXL+kaSpFusnW5zxzr4nn869vu
esvr153b5rN5pQSVJg9DFEKwk701ydZJvjJTtkuTUJN9pf96+LU+Sdya5pJ0c7ICF7r/sQbqq3m8WLUkaqPfTDBsedTRwWlXtB5zWHgM8Btiv3VYA717o5mbSkqRhqBr/tuBb1peAazcpPhw4rt0/jqbD9Ez58dU4C9h5oUm5DNKL0I7ZHowkL92csmnXNiXt03U9JHVhGQL04vts7VFVVwK0P3dvy/cCLh+5bk1bNieD9OL8c5JzkrxgIAuCHDFL2XMnXYmlqqYX5Ce6rse4JHlz34cZts/m5ty6rt+WSHJB+xxx1q3r+i1Gkj2SHJvk/7XH90lyVNf1mjK7JVk1sq1Y+CVzyixl834TcKnKRaiqQ9ux438MrEpyDvC+qjq146ptkSTPBJ4F3CXJySOndgSu6aZWS3ZWkgdV1Ve7rsgYfAtYmeRWwPuAD1XV9R3XaUudS/NLaK5fTnedbJJWAskAAA4rSURBVHWW5PHtzxe2Pz/Q/nw2zZTHffR+mn9br26PLwY+AhzbVYUWrViuyUzWLqJ391VJ9qyqK9vm7Kvb8jXAaGvf3sAV893IIL1IVfWdJK8BVgHvBB6YZr7SV1XVv3dbu832HzRTq+4GvGWk/CdALzMDmqlrn5fk+zQz2oUmyb5/t9XaclX1HuA9Se4JHAmcn+RM4F+r6vRua7d5quouXddhXKrq+wBJHlpVDx05dXT79/L6bmq2JLtV1UeTvBKgqtYlWd91pQbgZJoWyje2P08aKX9Rkg8DDwGun2kWn4tBehGS3J/ml+bjgFOBP6iq89qZ0v4T6EWQbn/pfD/J7wI3VtWGJPcA7kUzf3ofPabrCoxTkq1p/j7uRbP4wdeBlyd5XlU9o9PKbaF2GMp+jEwj3Ha66ZvbJjm0qr4CkOS3gNt2XKfF+lmSXWmbXJMcDPSttWajDiYzSfIh4DCaZvE1wOtogvNH20cHPwCe1l7+GZrpri+haX05cqH7G6QX5x+Af6XJmm9eSKRdwvM13VVr0b4E/Hb7S/Q0mtaBp9M04/XKSLazO78+p3yvJHkr8ASav5P/U1XntKf+Lslyr+w2Vkn+BHgpTfPeauBgmi+0j+iyXot0FPDeduZDgOuYZTW+nng5TXZ3t7Y14A7AU7ut0uJ1sbRkVT1zjlOPnOXaYuPjks1ikN5CbWZzeVV9YLbzc5VPuVTVDe23vndV1d8n+VrXlVqMJE+gabq/I81zoDsDFwF97ID1DeA1VTXb884HT7oyS/RS4EHAWVX18CT3Av664zotSlWdCzwgyY40/3d6m3m2LYC/Q7NscIBvV9UvO66WRhikt1BVrU+ya5Jtq2r5V6udjLQLlzybJkuA/v7beANNlvb5qnpgkocDc33TnXbvA56U5FCa5sivVNWJAD0MDD+vqp8nIcl2VfWt9ll7LyV5HM0Xv+2brihQVb17Jp3kacBnq+rCthXwgCR/U1XndV23RRngKlh9/UXcte8DZ7Y9om9ebrOq3tpdlZbkZcArgRPb/6x3BXrRMWkWv6yqa5JslWSrqjo9zbrkffSPNIvSfKg9fl6S362qLWoumxJr2uGKn6BZE/5HLNCrdVol+WfgNjSdFN9D0zx8zrwvml7/q6o+1n4R/H2aVQffTdOpSVPAIL04V7TbVsDtOq7LklXVF4EvjhxfCrykuxotyXVJdgC+DHwwydU0q7f10e8A922fY5HkOHraoa+qntTuHpPkdGAn4LMdVmkpfquq7p/k/Kr66yRvoSedRWcx05P7ccC7q+qkJMd0WJ/FK2CDmbSAqurls7S5tL80f+1fd1X1sVPP4cCNNK0Dz6YJBr1rhmx9G7gTTcsNNOMrezc0LslWwPlVdV+4+Uthn810Fr2hHdFxLdDXoWb/leRfgN+l6ZC4Hb2d5GpJM4RNLYP0IiT5JL8e1K6n6RX9L1X188nXakleMbK/PfAUepp9VtXPktwZ2K+qjktyG2Drruu1SLsCF7WT5UDT8eo/ZyaeqaondFazLdAO7ft6kjtV1Q+6rs8YfKptuv97mslaoGn27qM/pFkc4s1VdV078cZfdlwnjTBIL86lNEMVZp4VPh24CrgHzdCs/9FRvRal7a066swkvcx2kvwpzeoyuwB3o5kX95+ZZThED7y26wqM0Z7Ahe0XjtF+HL34ogGQ5EE0Izve0B7vQPP44VvA27qs25ZKsmNV/ZjmS/kZbdkuwE00yUY/mUmr9cCqetjI8SeTfKmqHpbkws5qtUibzKG8FXAg8BsdVWepXkgzPOlsuHlmuN3nf8l0qqovJvkNms9TwFer6r87rtZiDeER0UyzMEkeRjNhxYuB/YGV9Gt88b/RTHM627StfZuuddAM0otzh9GmuyR3oplaE6CPw7JG/6OuA77HxqFYfXNTVf1iZlhMO+91L79etxOAvBb4As3fzbuSvL6q3tttzRblsVX1V6MFba/7PrXYbF1VM0sSPh1YWVUnACckWd1hvbZYVT2+ncb4dwbyCKJhJq3WXwBfSfJdml+edwFekOS2bFxDtDeGNL8y8MUkrwJuneRRwAuAT3Zcp8X6S5pWm2sA2ukb/wPoY5B+FPBXm5Q9ZpayabZ1kltV1TqaxyejqyH17ndpVVWSE2lazjSlevcPaxpU1WfSrIJ1L5og/a2RzmJv765mi5PkybMUXw9cUFVXz3Jumh1N0wpwAfA8mrly+9qpZw3NYiczfsKvrkU79ZI8n+aL0t3yq8s53o7mC0effIjmS+Bamh7eXwZIcnf6O9/1cFaNG+gQrNQAmwcmoZ1Uf19GvuhU1fGdVWgJknwaOISNE5gcBpxF0xHu9X2Y6nRAPYdvluR44H40K+gUzfCyc2iWE+zF5Dnt/Na3B/6W5gvUjJ+MNB33RrsAxZ7AKVX1s7bsHsAOfZylK8k3af6f937VuJ2226N+647jX27gs5e97dxFLFU5NmbSi5DkAzQ9h1ezcTKAAnoZpGnWjrl3VV0FzULwbJx16EtsXDd3mn0COAAgyQlV9ZSO6zMO3223GTPL3fVmAp12+tLrk2zarL1Dkh369sWqqs6apeziLuoyJoNaNW6IDNKLcxBwnxpOM8S+MwG6dTVwj6q6NklfJtsf7Z06iJ6pA5s059Ns7Jy4PU0/jm/Tz4VPBqOqvp/kAGBmfvgz+9gicLPB/EreyCC9ON+gGaI072LdPfLlJJ8CPtYePxX4UtsR7rruqrVFao793kpyB+B/0i7kMFPex5ngqup+o8dtYHheR9VRK8lradY6npnW9H1JPlZVf9NhtTTCIL04uwHfbCdmuKktq6o6vMM6LcULgSfTfJsOTQ/1E9qWgod3WbEt8IAkP6ap/63bfdj4jG3H7qq2aB8EPkIznvXPgCOAH3ZaozFpl0h8UNf1EM+kGUHwc4AkbwTOA/oXpAfaccwgvTjHjOyHJrj1dTlEaFb0+URVndAuH3hPmn8bfWnqpqr6OvXnfHatqmOTvHRmEZQezwT38pHDrWj6DwziC0fPXUbTSjMzOmU7frUfRL/Y3C24eSao/YFn0cx9+z2aqSf76kvAbye5Pf
B5mmkBn06zQIW6M/Ml6cp2/eIrgL07rM9SjHZ2W0fzjPqEjuqijW6ima71VJpc9FE0c0C8E6Cq+roa3mAYpLdAO9TiGTRZ8zU0TZGpqr40Cc8lVXVDkqOAd1XV3yf5WteVEn/TDmH6C+BdwI7An3dbpcWZ6QSX5LYzQ5c0FU5stxlndFSP8TCTvsX7Fs0EBn9QVZcAJOnlL81NJMkhNJnzzHSg/tvoWFV9qt29nv70DZhV++/rWGAH4E5JHgA8r6pe0G3NbrmSbA08qqr+qOu6aG7+It4yT6HJpE9P8lngw/zq0J++ehnwSuDEqrowyV3ZOLGJJizJu5inh3pPmyDfDvw+MLPM5tfbRSrUkapan+QOSbatqj6uObAJ15O+xauqE4ET26FJT6RpetwjybtpAtwpnVZwkWY6JY0cXwr0MRAMxehSgX8NvK6rioxTVV0+s/BJa/1c12piLqNZmvZkfnUJ0amfze7XFLBhQ9e1GDuD9CK0z9Q+CHywXebxaTRTHvYqSCd5e1W9LMknmSVz69Nav0NSVTcv0pLkZaPHPXZ5O5VuJdmW5kvgRR3XSU1nxCtoetz3Zia7WxKD9BK18w//S7v1zcx0n2/utBaaz1Da7/4MeAewF83CIafQjM9XhwY2q53N3RqWqjq3/fnFdnYrqsqxqxq7qlqLQ/qmTpLTmb0VrXez2g2VQfoWrF30/XXAi2g6wG2VZB3NMKzXd1q5W7AkP2HjL87b9Hn2tHbayblUVb1hYpXRbF4xsr89TefYdR3VZenMpDUwLwMeCjyoqr4H0PbsfneSP6+qt3Vau1uoqhrSs8HZxkTflmao366AQbpDM61pI87s66x2Q2WQvmV7Ds04ybUzBVV1aZI/onlmaJDWklTVW2b2k9wOeClwJM3wxbfM9TpNRtvxdcZWNCv8/UZH1Vmicu5uDc42owF6RlX9MMk2XVRIw9MGgpfTPJM+Djigqn7Uba3UOpeNj1bW0QzJOmrOq6dZQZVDsDQs801gMIDJDdS1JG+iWWFtJXC/qvppx1US0K5AdnlV3aU9PoLmefRlwDc7rJo2sVXXFVCnHpDkx7NsPwHut+CrpYX9BXBH4DXAFaP/xkY6xGny/oX2i3g789vf0rRyXE/zhaqfNtT4t46ZSd+CDXR5R02RqjIRmE5bt3M8QLPi3cqqOgE4IcnqDuulTfgfSJJuebZOMpOkPRL4wsi5/iZvVePfOtbfvwxJ0mJ9CPhikrXAjTSr+5Hk7jRN3v1T5dzdkqT+q6r/neQ0YE/glKqbU8atgBd3VzNtyiAtSbdAVXXWLGUXd1GXsZmC5ulx85m0JElTykxakjQI5TNpSZKm0XT0xh43m7slSZpSZtKSpP4rpmKGsHEzk5YkaUqZSUuShmGAq2CZSUuSNKXMpCVJvVdADfCZtEFaktR/VTZ3S5KkjZI8Osm3k1yS5Ohx399MWpI0CJNu7k6yNfCPwKOANcBXk5xcVd8c13uYSUuStDgPBi6pqkur6hfAh4HDx/kGZtKSpGGY/DPpvYDLR47XAA8Z5xsYpCVJvfcTfvS5z9fHd1uGW2+fZNXI8cqqWtnuZ5brx9rmbpCWJPVeVT26g7ddA+wzcrw3cMU438Bn0pIkLc5Xgf2S3CXJtsAzgJPH+QZm0pIkLUJVrUvyIuBzwNbAe6vqwnG+R2qA629KkjQENndLkjSlDNKSJE0pg7QkSVPKIC1J0pQySEuSNKUM0pIkTSmDtCRJU8ogLUnSlPr/Z6bR2NdNyw4AAAAASUVORK5CYII=\n",
|
| 387 |
-
"text/plain": [
|
| 388 |
-
"<Figure size 576x576 with 2 Axes>"
|
| 389 |
-
]
|
| 390 |
-
},
|
| 391 |
-
"metadata": {
|
| 392 |
-
"needs_background": "light"
|
| 393 |
-
},
|
| 394 |
-
"output_type": "display_data"
|
| 395 |
-
}
|
| 396 |
-
],
|
| 397 |
-
"source": [
|
| 398 |
-
"import matplotlib.pyplot as plt\n",
|
| 399 |
-
"import sklearn\n",
|
| 400 |
-
"from sklearn.metrics import classification_report, confusion_matrix\n",
|
| 401 |
-
"import numpy as np\n",
|
| 402 |
-
"\n",
|
| 403 |
-
"nb_train_samples = 28273\n",
|
| 404 |
-
"nb_validation_samples = 3534\n",
|
| 405 |
-
"\n",
|
| 406 |
-
"# We need to recreate our validation generator with shuffle = false\n",
|
| 407 |
-
"validation_generator = validation_datagen.flow_from_directory(\n",
|
| 408 |
-
" validation_data_dir,\n",
|
| 409 |
-
" color_mode = 'grayscale',\n",
|
| 410 |
-
" target_size=(img_rows, img_cols),\n",
|
| 411 |
-
" batch_size=batch_size,\n",
|
| 412 |
-
" class_mode='categorical',\n",
|
| 413 |
-
" shuffle=False)\n",
|
| 414 |
-
"\n",
|
| 415 |
-
"class_labels = validation_generator.class_indices\n",
|
| 416 |
-
"class_labels = {v: k for k, v in class_labels.items()}\n",
|
| 417 |
-
"classes = list(class_labels.values())\n",
|
| 418 |
-
"\n",
|
| 419 |
-
"#Confution Matrix and Classification Report\n",
|
| 420 |
-
"Y_pred = model.predict(validation_generator)\n",
|
| 421 |
-
"y_pred = np.argmax(Y_pred, axis=1)\n",
|
| 422 |
-
"\n",
|
| 423 |
-
"print('Confusion Matrix')\n",
|
| 424 |
-
"print(confusion_matrix(validation_generator.classes, y_pred))\n",
|
| 425 |
-
"print('Classification Report')\n",
|
| 426 |
-
"target_names = list(class_labels.values())\n",
|
| 427 |
-
"print(classification_report(validation_generator.classes, y_pred, target_names=target_names))\n",
|
| 428 |
-
"\n",
|
| 429 |
-
"plt.figure(figsize=(8,8))\n",
|
| 430 |
-
"cnf_matrix = confusion_matrix(validation_generator.classes, y_pred)\n",
|
| 431 |
-
"\n",
|
| 432 |
-
"plt.imshow(cnf_matrix, interpolation='nearest')\n",
|
| 433 |
-
"plt.colorbar()\n",
|
| 434 |
-
"tick_marks = np.arange(len(classes))\n",
|
| 435 |
-
"_ = plt.xticks(tick_marks, classes, rotation=90)\n",
|
| 436 |
-
"_ = plt.yticks(tick_marks, classes)"
|
| 437 |
-
]
|
| 438 |
-
},
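The matrix above shows the briefly-trained model collapsing onto the two most common predictions (the Happy and Sad columns), which is also what triggers sklearn's UndefinedMetricWarning below. Beyond training for more epochs, a common mitigation for FER2013's class imbalance is per-class weighting; a sketch using standard sklearn/Keras APIs (the resulting weights depend on the actual label counts):

import numpy as np
from sklearn.utils.class_weight import compute_class_weight

labels = train_generator.classes
weights = compute_class_weight("balanced", classes=np.unique(labels), y=labels)
class_weight = dict(enumerate(weights))

# Rare classes such as Disgust now contribute more to the loss:
# model.fit(train_generator, epochs=epochs, class_weight=class_weight, ...)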
|
| 439 |
-
{
|
| 440 |
-
"cell_type": "markdown",
|
| 441 |
-
"metadata": {},
|
| 442 |
-
"source": [
|
| 443 |
-
"### Loading our saved model"
|
| 444 |
-
]
|
| 445 |
-
},
|
| 446 |
-
{
|
| 447 |
-
"cell_type": "code",
|
| 448 |
-
"execution_count": 20,
|
| 449 |
-
"metadata": {},
|
| 450 |
-
"outputs": [],
|
| 451 |
-
"source": [
|
| 452 |
-
"from tensorflow.keras.models import load_model\n",
|
| 453 |
-
"\n",
|
| 454 |
-
"classifier = load_model('emotion_little_vgg.h5')"
|
| 455 |
-
]
|
| 456 |
-
},
|
| 457 |
-
{
|
| 458 |
-
"cell_type": "markdown",
|
| 459 |
-
"metadata": {},
|
| 460 |
-
"source": [
|
| 461 |
-
"### Get our class labels"
|
| 462 |
-
]
|
| 463 |
-
},
|
| 464 |
-
{
|
| 465 |
-
"cell_type": "code",
|
| 466 |
-
"execution_count": 21,
|
| 467 |
-
"metadata": {},
|
| 468 |
-
"outputs": [
|
| 469 |
-
{
|
| 470 |
-
"name": "stdout",
|
| 471 |
-
"output_type": "stream",
|
| 472 |
-
"text": [
|
| 473 |
-
"Found 3589 images belonging to 7 classes.\n",
|
| 474 |
-
"{0: 'Angry', 1: 'Disgust', 2: 'Fear', 3: 'Happy', 4: 'Neutral', 5: 'Sad', 6: 'Surprise'}\n"
|
| 475 |
-
]
|
| 476 |
-
}
|
| 477 |
-
],
|
| 478 |
-
"source": [
|
| 479 |
-
"validation_generator = validation_datagen.flow_from_directory(\n",
|
| 480 |
-
" validation_data_dir,\n",
|
| 481 |
-
" color_mode = 'grayscale',\n",
|
| 482 |
-
" target_size=(img_rows, img_cols),\n",
|
| 483 |
-
" batch_size=batch_size,\n",
|
| 484 |
-
" class_mode='categorical',\n",
|
| 485 |
-
" shuffle=False)\n",
|
| 486 |
-
"\n",
|
| 487 |
-
"class_labels = validation_generator.class_indices\n",
|
| 488 |
-
"class_labels = {v: k for k, v in class_labels.items()}\n",
|
| 489 |
-
"classes = list(class_labels.values())\n",
|
| 490 |
-
"print(class_labels)"
|
| 491 |
-
]
|
| 492 |
-
},
|
| 493 |
-
{
|
| 494 |
-
"cell_type": "markdown",
|
| 495 |
-
"metadata": {},
|
| 496 |
-
"source": [
|
| 497 |
-
"### Let's test on some of validation images"
|
| 498 |
-
]
|
| 499 |
-
},
|
| 500 |
-
{
|
| 501 |
-
"cell_type": "code",
|
| 502 |
-
"execution_count": 25,
|
| 503 |
-
"metadata": {},
|
| 504 |
-
"outputs": [],
|
| 505 |
-
"source": [
|
| 506 |
-
"from tensorflow.keras.models import load_model\n",
|
| 507 |
-
"from tensorflow.keras.optimizers import RMSprop, SGD, Adam\n",
|
| 508 |
-
"from tensorflow.keras.preprocessing import image\n",
|
| 509 |
-
"import numpy as np\n",
|
| 510 |
-
"import os\n",
|
| 511 |
-
"import cv2\n",
|
| 512 |
-
"import numpy as np\n",
|
| 513 |
-
"from os import listdir\n",
|
| 514 |
-
"from os.path import isfile, join\n",
|
| 515 |
-
"import re\n",
|
| 516 |
-
"\n",
|
| 517 |
-
"def draw_test(name, pred, im, true_label):\n",
|
| 518 |
-
" BLACK = [0,0,0]\n",
|
| 519 |
-
" expanded_image = cv2.copyMakeBorder(im, 160, 0, 0, 300 ,cv2.BORDER_CONSTANT,value=BLACK)\n",
|
| 520 |
-
" cv2.putText(expanded_image, \"predited - \"+ pred, (20, 60) , cv2.FONT_HERSHEY_SIMPLEX,1, (0,0,255), 2)\n",
|
| 521 |
-
" cv2.putText(expanded_image, \"true - \"+ true_label, (20, 120) , cv2.FONT_HERSHEY_SIMPLEX,1, (0,255,0), 2)\n",
|
| 522 |
-
" cv2.imshow(name, expanded_image)\n",
|
| 523 |
-
"\n",
|
| 524 |
-
"\n",
|
| 525 |
-
"def getRandomImage(path, img_width, img_height):\n",
|
| 526 |
-
" \"\"\"function loads a random images from a random folder in our test path \"\"\"\n",
|
| 527 |
-
" folders = list(filter(lambda x: os.path.isdir(os.path.join(path, x)), os.listdir(path)))\n",
|
| 528 |
-
" random_directory = np.random.randint(0,len(folders))\n",
|
| 529 |
-
" path_class = folders[random_directory]\n",
|
| 530 |
-
" file_path = path + path_class\n",
|
| 531 |
-
" file_names = [f for f in listdir(file_path) if isfile(join(file_path, f))]\n",
|
| 532 |
-
" random_file_index = np.random.randint(0,len(file_names))\n",
|
| 533 |
-
" image_name = file_names[random_file_index]\n",
|
| 534 |
-
" final_path = file_path + \"/\" + image_name\n",
|
| 535 |
-
" return image.load_img(final_path, target_size = (img_width, img_height),grayscale=True), final_path, path_class\n",
|
| 536 |
-
"\n",
|
| 537 |
-
"# dimensions of our images\n",
|
| 538 |
-
"img_width, img_height = 48, 48\n",
|
| 539 |
-
"\n",
|
| 540 |
-
"# We use a very small learning rate \n",
|
| 541 |
-
"model.compile(loss = 'categorical_crossentropy',\n",
|
| 542 |
-
" optimizer = RMSprop(lr = 0.001),\n",
|
| 543 |
-
" metrics = ['accuracy'])\n",
|
| 544 |
-
"\n",
|
| 545 |
-
"files = []\n",
|
| 546 |
-
"predictions = []\n",
|
| 547 |
-
"true_labels = []\n",
|
| 548 |
-
"\n",
|
| 549 |
-
"# predicting images\n",
|
| 550 |
-
"for i in range(0, 10):\n",
|
| 551 |
-
" path = './fer2013/validation/' \n",
|
| 552 |
-
" img, final_path, true_label = getRandomImage(path, img_width, img_height)\n",
|
| 553 |
-
" files.append(final_path)\n",
|
| 554 |
-
" true_labels.append(true_label)\n",
|
| 555 |
-
" x = image.img_to_array(img)\n",
|
| 556 |
-
" x = x * 1./255\n",
|
| 557 |
-
" x = np.expand_dims(x, axis=0)\n",
|
| 558 |
-
" images = np.vstack([x])\n",
|
| 559 |
-
" classes = model.predict_classes(images, batch_size = 10)\n",
|
| 560 |
-
" predictions.append(classes)\n",
|
| 561 |
-
" \n",
|
| 562 |
-
"for i in range(0, len(files)):\n",
|
| 563 |
-
" image = cv2.imread((files[i]))\n",
|
| 564 |
-
" image = cv2.resize(image, None, fx=3, fy=3, interpolation = cv2.INTER_CUBIC)\n",
|
| 565 |
-
" draw_test(\"Prediction\", class_labels[predictions[i][0]], image, true_labels[i])\n",
|
| 566 |
-
" cv2.waitKey(0)\n",
|
| 567 |
-
"\n",
|
| 568 |
-
"cv2.destroyAllWindows()"
|
| 569 |
-
]
|
| 570 |
-
},
|
| 571 |
-
{
|
| 572 |
-
"cell_type": "markdown",
|
| 573 |
-
"metadata": {},
|
| 574 |
-
"source": [
|
| 575 |
-
"### Test on a single image"
|
| 576 |
-
]
|
| 577 |
-
},
|
| 578 |
-
{
|
| 579 |
-
"cell_type": "code",
|
| 580 |
-
"execution_count": 27,
|
| 581 |
-
"metadata": {},
|
| 582 |
-
"outputs": [],
|
| 583 |
-
"source": [
|
| 584 |
-
"from tensorflow.keras.models import load_model\n",
|
| 585 |
-
"from tensorflow.keras.preprocessing import image\n",
|
| 586 |
-
"import numpy as np\n",
|
| 587 |
-
"import os\n",
|
| 588 |
-
"import cv2\n",
|
| 589 |
-
"import numpy as np\n",
|
| 590 |
-
"from os import listdir\n",
|
| 591 |
-
"from os.path import isfile, join\n",
|
| 592 |
-
"from tensorflow.keras.preprocessing.image import img_to_array\n",
|
| 593 |
-
"\n",
|
| 594 |
-
"face_classifier = cv2.CascadeClassifier('./Haarcascades/haarcascade_frontalface_default.xml')\n",
|
| 595 |
-
"\n",
|
| 596 |
-
"def face_detector(img):\n",
|
| 597 |
-
" # Convert image to grayscale\n",
|
| 598 |
-
" gray = cv2.cvtColor(img.copy(),cv2.COLOR_BGR2GRAY)\n",
|
| 599 |
-
" faces = face_classifier.detectMultiScale(gray, 1.3, 5)\n",
|
| 600 |
-
" if faces is ():\n",
|
| 601 |
-
" return (0,0,0,0), np.zeros((48,48), np.uint8), img\n",
|
| 602 |
-
" \n",
|
| 603 |
-
" allfaces = [] \n",
|
| 604 |
-
" rects = []\n",
|
| 605 |
-
" for (x,y,w,h) in faces:\n",
|
| 606 |
-
" cv2.rectangle(img,(x,y),(x+w,y+h),(255,0,0),2)\n",
|
| 607 |
-
" roi_gray = gray[y:y+h, x:x+w]\n",
|
| 608 |
-
" roi_gray = cv2.resize(roi_gray, (48, 48), interpolation = cv2.INTER_AREA)\n",
|
| 609 |
-
" allfaces.append(roi_gray)\n",
|
| 610 |
-
" rects.append((x,w,y,h))\n",
|
| 611 |
-
" return rects, allfaces, img\n",
|
| 612 |
-
"\n",
|
| 613 |
-
"img = cv2.imread(\"rajeev.jpg\")\n",
|
| 614 |
-
"rects, faces, image = face_detector(img)\n",
|
| 615 |
-
"\n",
|
| 616 |
-
"i = 0\n",
|
| 617 |
-
"for face in faces:\n",
|
| 618 |
-
" roi = face.astype(\"float\") / 255.0\n",
|
| 619 |
-
" roi = img_to_array(roi)\n",
|
| 620 |
-
" roi = np.expand_dims(roi, axis=0)\n",
|
| 621 |
-
"\n",
|
| 622 |
-
" # make a prediction on the ROI, then lookup the class\n",
|
| 623 |
-
" preds = classifier.predict(roi)[0]\n",
|
| 624 |
-
" label = class_labels[preds.argmax()] \n",
|
| 625 |
-
"\n",
|
| 626 |
-
" #Overlay our detected emotion on our pic\n",
|
| 627 |
-
" label_position = (rects[i][0] + int((rects[i][1]/2)), abs(rects[i][2] - 10))\n",
|
| 628 |
-
" i =+ 1\n",
|
| 629 |
-
" cv2.putText(image, label, label_position , cv2.FONT_HERSHEY_SIMPLEX,1, (0,255,0), 2)\n",
|
| 630 |
-
" \n",
|
| 631 |
-
"cv2.imshow(\"Emotion Detector\", image)\n",
|
| 632 |
-
"cv2.waitKey(0)\n",
|
| 633 |
-
"\n",
|
| 634 |
-
"cv2.destroyAllWindows()"
|
| 635 |
-
]
|
| 636 |
-
},
|
| 637 |
-
{
|
| 638 |
-
"cell_type": "markdown",
|
| 639 |
-
"metadata": {},
|
| 640 |
-
"source": [
|
| 641 |
-
"### Let's try this on our webcam\n"
|
| 642 |
-
]
|
| 643 |
-
},
|
| 644 |
-
{
|
| 645 |
-
"cell_type": "code",
|
| 646 |
-
"execution_count": 29,
|
| 647 |
-
"metadata": {},
|
| 648 |
-
"outputs": [],
|
| 649 |
-
"source": [
|
| 650 |
-
"import cv2\n",
|
| 651 |
-
"import numpy as np\n",
|
| 652 |
-
"from time import sleep\n",
|
| 653 |
-
"from tensorflow.keras.preprocessing.image import img_to_array\n",
|
| 654 |
-
"\n",
|
| 655 |
-
"face_classifier = cv2.CascadeClassifier('./Haarcascades/haarcascade_frontalface_default.xml')\n",
|
| 656 |
-
"\n",
|
| 657 |
-
"def face_detector(img):\n",
|
| 658 |
-
" # Convert image to grayscale\n",
|
| 659 |
-
" gray = cv2.cvtColor(img,cv2.COLOR_BGR2GRAY)\n",
|
| 660 |
-
" faces = face_classifier.detectMultiScale(gray, 1.3, 5)\n",
|
| 661 |
-
" if faces is ():\n",
|
| 662 |
-
" return (0,0,0,0), np.zeros((48,48), np.uint8), img\n",
|
| 663 |
-
" \n",
|
| 664 |
-
" for (x,y,w,h) in faces:\n",
|
| 665 |
-
" cv2.rectangle(img,(x,y),(x+w,y+h),(255,0,0),2)\n",
|
| 666 |
-
" roi_gray = gray[y:y+h, x:x+w]\n",
|
| 667 |
-
"\n",
|
| 668 |
-
" try:\n",
|
| 669 |
-
" roi_gray = cv2.resize(roi_gray, (48, 48), interpolation = cv2.INTER_AREA)\n",
|
| 670 |
-
" except:\n",
|
| 671 |
-
" return (x,w,y,h), np.zeros((48,48), np.uint8), img\n",
|
| 672 |
-
" return (x,w,y,h), roi_gray, img\n",
|
| 673 |
-
"\n",
|
| 674 |
-
"cap = cv2.VideoCapture(0)\n",
|
| 675 |
-
"\n",
|
| 676 |
-
"while True:\n",
|
| 677 |
-
"\n",
|
| 678 |
-
" ret, frame = cap.read()\n",
|
| 679 |
-
" rect, face, image = face_detector(frame)\n",
|
| 680 |
-
" if np.sum([face]) != 0.0:\n",
|
| 681 |
-
" roi = face.astype(\"float\") / 255.0\n",
|
| 682 |
-
" roi = img_to_array(roi)\n",
|
| 683 |
-
" roi = np.expand_dims(roi, axis=0)\n",
|
| 684 |
-
"\n",
|
| 685 |
-
" # make a prediction on the ROI, then lookup the class\n",
|
| 686 |
-
" preds = classifier.predict(roi)[0]\n",
|
| 687 |
-
" label = class_labels[preds.argmax()] \n",
|
| 688 |
-
" label_position = (rect[0] + int((rect[1]/2)), rect[2] + 25)\n",
|
| 689 |
-
" cv2.putText(image, label, label_position , cv2.FONT_HERSHEY_SIMPLEX,2, (0,255,0), 3)\n",
|
| 690 |
-
" else:\n",
|
| 691 |
-
" cv2.putText(image, \"No Face Found\", (20, 60) , cv2.FONT_HERSHEY_SIMPLEX,2, (0,255,0), 3)\n",
|
| 692 |
-
" \n",
|
| 693 |
-
" cv2.imshow('All', image)\n",
|
| 694 |
-
" if cv2.waitKey(1) == 13: #13 is the Enter Key\n",
|
| 695 |
-
" break\n",
|
| 696 |
-
" \n",
|
| 697 |
-
"cap.release()\n",
|
| 698 |
-
"cv2.destroyAllWindows() "
|
| 699 |
-
]
|
| 700 |
-
}
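Per-frame predictions in a loop like this tend to flicker between adjacent emotions; a small, optional addition (not part of the original notebook) is to smooth the displayed label with a majority vote over recent frames:

from collections import Counter, deque

recent = deque(maxlen=15)  # roughly half a second of frames at webcam rates

# inside the while-loop, after `label` is computed:
#     recent.append(label)
#     smoothed = Counter(recent).most_common(1)[0][0]
#     # draw `smoothed` instead of `label` in cv2.putText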
|
| 701 |
-
],
|
| 702 |
-
"metadata": {
|
| 703 |
-
"kernelspec": {
|
| 704 |
-
"display_name": "Python 3",
|
| 705 |
-
"language": "python",
|
| 706 |
-
"name": "python3"
|
| 707 |
-
},
|
| 708 |
-
"language_info": {
|
| 709 |
-
"codemirror_mode": {
|
| 710 |
-
"name": "ipython",
|
| 711 |
-
"version": 3
|
| 712 |
-
},
|
| 713 |
-
"file_extension": ".py",
|
| 714 |
-
"mimetype": "text/x-python",
|
| 715 |
-
"name": "python",
|
| 716 |
-
"nbconvert_exporter": "python",
|
| 717 |
-
"pygments_lexer": "ipython3",
|
| 718 |
-
"version": "3.7.4"
|
| 719 |
-
}
|
| 720 |
-
},
|
| 721 |
-
"nbformat": 4,
|
| 722 |
-
"nbformat_minor": 2
|
| 723 |
-
}
|
|
|
|
| 1 |
+
version https://git-lfs.github.com/spec/v1
|
| 2 |
+
oid sha256:9f82a4b31aea8dbfdcd71cadb1d6df5fa1be4dcb41d21aaaaa82d632437fa5bb
|
| 3 |
+
size 44387
|
18 . Deep Survaliance - Build a Face Detector with Emotion, Age and Gender Recognition/18.3A - Age, Gender Detection.ipynb
CHANGED
|
@@ -1,174 +1,3 @@
-{
- "cells": [
-  {
-   "cell_type": "markdown",
-   "metadata": {},
-   "source": [
-    "### Let's run our Age and Gender Detector\n",
-    "\n",
-    "- See https://github.com/yu4u/age-gender-estimation for the source project.\n",
-    "- In this notebook we re-use the model trained by yu4u"
-   ]
-  },
-  {
-   "cell_type": "code",
-   "execution_count": null,
-   "metadata": {},
-   "outputs": [],
-   "source": []
-  },
-  {
-   "cell_type": "code",
-   "execution_count": 5,
-   "metadata": {},
-   "outputs": [
-    {
-     "name": "stdout",
-     "output_type": "stream",
-     "text": [
-      "Downloading data from https://github.com/yu4u/age-gender-estimation/releases/download/v0.5/weights.28-3.73.hdf5\n",
-      "195854336/195848088 [==============================] - 173s 1us/step\n"
-     ]
-    }
-   ],
-   "source": [
-    "from pathlib import Path\n",
-    "import cv2\n",
-    "import dlib\n",
-    "import sys\n",
-    "import numpy as np\n",
-    "import argparse\n",
-    "from contextlib import contextmanager\n",
-    "from wide_resnet import WideResNet\n",
-    "from tensorflow.keras.utils import get_file\n",
-    "\n",
-    "# Load our cascade classifier for faces\n",
-    "face_classifier = cv2.CascadeClassifier('./Haarcascades/haarcascade_frontalface_default.xml')\n",
-    "\n",
-    "# Load our pretrained model for Gender and Age Detection\n",
-    "pretrained_model = \"https://github.com/yu4u/age-gender-estimation/releases/download/v0.5/weights.28-3.73.hdf5\"\n",
-    "modhash = 'fbe63257a054c1c5466cfd7bf14646d6'\n",
-    "\n",
-    "# Face Detection function\n",
-    "def face_detector(img):\n",
-    "    # Convert image to grayscale for faster detection\n",
-    "    gray = cv2.cvtColor(img.copy(), cv2.COLOR_BGR2GRAY)\n",
-    "    faces = face_classifier.detectMultiScale(gray, 1.3, 5)\n",
-    "    if len(faces) == 0:\n",
-    "        return False, (0,0,0,0), np.zeros((1,48,48,3), np.uint8), img\n",
-    "\n",
-    "    allfaces = []\n",
-    "    rects = []\n",
-    "    for (x,y,w,h) in faces:\n",
-    "        cv2.rectangle(img, (x,y), (x+w,y+h), (255,0,0), 2)\n",
-    "        roi = img[y:y+h, x:x+w]\n",
-    "        allfaces.append(roi)\n",
-    "        rects.append((x,w,y,h))\n",
-    "    return True, rects, allfaces, img\n",
-    "\n",
-    "# Define our model parameters\n",
-    "depth = 16\n",
-    "k = 8\n",
-    "weight_file = None\n",
-    "margin = 0.4\n",
-    "image_dir = None\n",
-    "\n",
-    "# Get our weight file\n",
-    "if not weight_file:\n",
-    "    weight_file = get_file(\"weights.28-3.73.hdf5\", pretrained_model, cache_subdir=\"pretrained_models\",\n",
-    "                           file_hash=modhash, cache_dir=Path(sys.argv[0]).resolve().parent)\n",
-    "\n",
-    "# Load model and weights\n",
-    "img_size = 64\n",
-    "model = WideResNet(img_size, depth=depth, k=k)()\n",
-    "model.load_weights(weight_file)\n",
-    "\n",
-    "# Initialize webcam\n",
-    "cap = cv2.VideoCapture(0)\n",
-    "\n",
-    "while True:\n",
-    "    ret, frame = cap.read()\n",
-    "    ret, rects, faces, image = face_detector(frame)\n",
-    "    preprocessed_faces = []\n",
-    "    if ret:\n",
-    "        for (i, face) in enumerate(faces):\n",
-    "            face = cv2.resize(face, (64, 64), interpolation=cv2.INTER_AREA)\n",
-    "            preprocessed_faces.append(face)\n",
-    "\n",
-    "        # make a prediction on the faces detected\n",
-    "        results = model.predict(np.array(preprocessed_faces))\n",
-    "        predicted_genders = results[0]\n",
-    "        ages = np.arange(0, 101).reshape(101, 1)\n",
-    "        predicted_ages = results[1].dot(ages).flatten()\n",
-    "\n",
-    "        # draw results\n",
-    "        for (i, f) in enumerate(faces):\n",
-    "            label = \"{}, {}\".format(int(predicted_ages[i]),\n",
-    "                                    \"F\" if predicted_genders[i][0] > 0.5 else \"M\")\n",
-    "\n",
-    "            # Overlay the predicted age and gender on the image\n",
-    "            label_position = (rects[i][0] + int(rects[i][1]/2), abs(rects[i][2] - 10))\n",
-    "            cv2.putText(image, label, label_position, cv2.FONT_HERSHEY_SIMPLEX, 1, (0,255,0), 2)\n",
-    "\n",
-    "    cv2.imshow(\"Age and Gender Detector\", image)\n",
-    "    if cv2.waitKey(1) == 13:  # 13 is the Enter key\n",
-    "        break\n",
-    "\n",
-    "cap.release()\n",
-    "cv2.destroyAllWindows()"
-   ]
-  },
-  {
-   "cell_type": "markdown",
-   "metadata": {},
-   "source": [
-    "## Note: if you get the following error, you need to enable your webcam\n",
-    "<img src=\"error.jpg\">\n",
-    "### Enable your webcam by doing the following:\n",
-    "<img src=\"webcam.jpg\">"
-   ]
-  },
-  {
-   "cell_type": "code",
-   "execution_count": 2,
-   "metadata": {},
-   "outputs": [],
-   "source": [
-    "# Run these lines if your webcam fails to be released due to an error in the code\n",
-    "cap.release()\n",
-    "cv2.destroyAllWindows()"
-   ]
-  },
-  {
-   "cell_type": "code",
-   "execution_count": null,
-   "metadata": {},
-   "outputs": [],
-   "source": []
-  }
- ],
- "metadata": {
-  "kernelspec": {
-   "display_name": "Python 3",
-   "language": "python",
-   "name": "python3"
-  },
-  "language_info": {
-   "codemirror_mode": {
-    "name": "ipython",
-    "version": 3
-   },
-   "file_extension": ".py",
-   "mimetype": "text/x-python",
-   "name": "python",
-   "nbconvert_exporter": "python",
-   "pygments_lexer": "ipython3",
-   "version": "3.7.4"
-  }
- },
- "nbformat": 4,
- "nbformat_minor": 2
-}
+version https://git-lfs.github.com/spec/v1
+oid sha256:dbc2fe2b91fac48067abea2d97fc9e34b5fc9aae290774fc24e4612bc83eb43e
+size 5569
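
Aside: one detail worth calling out in the removed cell above: the age head of the yu4u WideResNet outputs a 101-way softmax over ages 0-100, and `results[1].dot(ages)` collapses each distribution to a single estimate by taking its expected value. A minimal sketch of that step with a hypothetical two-spike distribution:

import numpy as np

# One detected face; a softmax row over ages 0..100 (hypothetical values).
age_probs = np.zeros((1, 101))
age_probs[0, 30] = 0.7
age_probs[0, 40] = 0.3

ages = np.arange(0, 101).reshape(101, 1)       # column vector of candidate ages
expected_ages = age_probs.dot(ages).flatten()  # E[age] per face
print(expected_ages)                           # [33.] = 0.7*30 + 0.3*40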
18 . Deep Survaliance - Build a Face Detector with Emotion, Age and Gender Recognition/18.3B Age, Gender with Emotion.ipynb
CHANGED
|
@@ -1,526 +1,3 @@
-{
- "cells": [
-  {
-   "cell_type": "markdown",
-   "metadata": {},
-   "source": [
-    "# Age, Gender and Emotion Detection\n",
-    "\n",
-    "### Let's load our classifiers"
-   ]
-  },
-  {
-   "cell_type": "code",
-   "execution_count": 3,
-   "metadata": {},
-   "outputs": [],
-   "source": [
-    "from pathlib import Path\n",
-    "import cv2\n",
-    "import dlib\n",
-    "import sys\n",
-    "import numpy as np\n",
-    "import argparse\n",
-    "from contextlib import contextmanager\n",
-    "from wide_resnet import WideResNet\n",
-    "from tensorflow.keras.utils import get_file\n",
-    "from tensorflow.keras.models import load_model\n",
-    "from tensorflow.keras.preprocessing.image import img_to_array\n",
-    "\n",
-    "classifier = load_model('emotion_little_vgg.h5')\n",
-    "face_classifier = cv2.CascadeClassifier('./Haarcascades/haarcascade_frontalface_default.xml')\n",
-    "pretrained_model = \"https://github.com/yu4u/age-gender-estimation/releases/download/v0.5/weights.28-3.73.hdf5\""
-   ]
-  },
-  {
-   "cell_type": "markdown",
-   "metadata": {},
-   "source": [
-    "### Testing our Emotion, Age and Gender Detector - Using Webcam"
-   ]
-  },
-  {
-   "cell_type": "code",
-   "execution_count": 4,
-   "metadata": {},
-   "outputs": [],
-   "source": [
-    "modhash = 'fbe63257a054c1c5466cfd7bf14646d6'\n",
-    "emotion_classes = {0: 'Angry', 1: 'Fear', 2: 'Happy', 3: 'Neutral', 4: 'Sad', 5: 'Surprise'}\n",
-    "\n",
-    "def face_detector(img):\n",
-    "    # Convert image to grayscale for faster detection\n",
-    "    gray = cv2.cvtColor(img.copy(), cv2.COLOR_BGR2GRAY)\n",
-    "    faces = face_classifier.detectMultiScale(gray, 1.3, 5)\n",
-    "    if len(faces) == 0:\n",
-    "        return False, (0,0,0,0), np.zeros((1,48,48,3), np.uint8), img\n",
-    "\n",
-    "    allfaces = []\n",
-    "    rects = []\n",
-    "    for (x,y,w,h) in faces:\n",
-    "        cv2.rectangle(img, (x,y), (x+w,y+h), (255,0,0), 2)\n",
-    "        roi = img[y:y+h, x:x+w]\n",
-    "        allfaces.append(roi)\n",
-    "        rects.append((x,w,y,h))\n",
-    "    return True, rects, allfaces, img\n",
-    "\n",
-    "# Define our model parameters\n",
-    "depth = 16\n",
-    "k = 8\n",
-    "weight_file = None\n",
-    "margin = 0.4\n",
-    "image_dir = None\n",
-    "\n",
-    "# Get our weight file\n",
-    "if not weight_file:\n",
-    "    weight_file = get_file(\"weights.28-3.73.hdf5\", pretrained_model, cache_subdir=\"pretrained_models\",\n",
-    "                           file_hash=modhash, cache_dir=Path(sys.argv[0]).resolve().parent)\n",
-    "\n",
-    "# Load model and weights\n",
-    "img_size = 64\n",
-    "model = WideResNet(img_size, depth=depth, k=k)()\n",
-    "model.load_weights(weight_file)\n",
-    "\n",
-    "# Initialize webcam\n",
-    "cap = cv2.VideoCapture(0)\n",
-    "\n",
-    "while True:\n",
-    "    ret, frame = cap.read()\n",
-    "    ret, rects, faces, image = face_detector(frame)\n",
-    "    preprocessed_faces_ag = []\n",
-    "    preprocessed_faces_emo = []\n",
-    "\n",
-    "    if ret:\n",
-    "        for (i, face) in enumerate(faces):\n",
-    "            face_ag = cv2.resize(face, (64, 64), interpolation=cv2.INTER_AREA)\n",
-    "            preprocessed_faces_ag.append(face_ag)\n",
-    "\n",
-    "            face_gray_emo = cv2.cvtColor(face, cv2.COLOR_BGR2GRAY)\n",
-    "            face_gray_emo = cv2.resize(face_gray_emo, (48, 48), interpolation=cv2.INTER_AREA)\n",
-    "            face_gray_emo = face_gray_emo.astype(\"float\") / 255.0\n",
-    "            face_gray_emo = img_to_array(face_gray_emo)\n",
-    "            face_gray_emo = np.expand_dims(face_gray_emo, axis=0)\n",
-    "            preprocessed_faces_emo.append(face_gray_emo)\n",
-    "\n",
-    "        # make a prediction for Age and Gender\n",
-    "        results = model.predict(np.array(preprocessed_faces_ag))\n",
-    "        predicted_genders = results[0]\n",
-    "        ages = np.arange(0, 101).reshape(101, 1)\n",
-    "        predicted_ages = results[1].dot(ages).flatten()\n",
-    "\n",
-    "        # make a prediction for Emotion\n",
-    "        emo_labels = []\n",
-    "        for (i, face) in enumerate(faces):\n",
-    "            preds = classifier.predict(preprocessed_faces_emo[i])[0]\n",
-    "            emo_labels.append(emotion_classes[preds.argmax()])\n",
-    "\n",
-    "        # overlay the predicted age, gender and emotion on each face\n",
-    "        for (i, face) in enumerate(faces):\n",
-    "            label = \"{}, {}, {}\".format(int(predicted_ages[i]),\n",
-    "                                        \"F\" if predicted_genders[i][0] > 0.6 else \"M\",\n",
-    "                                        emo_labels[i])\n",
-    "            label_position = (rects[i][0] + int(rects[i][1]/2), abs(rects[i][2] - 10))\n",
-    "            cv2.putText(image, label, label_position, cv2.FONT_HERSHEY_PLAIN, 1, (0,255,0), 2)\n",
-    "\n",
-    "    cv2.imshow(\"Emotion Detector\", image)\n",
-    "    if cv2.waitKey(1) == 13:  # 13 is the Enter key\n",
-    "        break\n",
-    "\n",
-    "cap.release()\n",
-    "cv2.destroyAllWindows()"
-   ]
-  },
-  {
-   "cell_type": "code",
-   "execution_count": 4,
-   "metadata": {},
-   "outputs": [],
-   "source": [
-    "cap.release()\n",
-    "cv2.destroyAllWindows()"
-   ]
-  },
-  {
-   "cell_type": "markdown",
-   "metadata": {},
-   "source": [
-    "### Testing our Emotion, Age and Gender Detector - On Images"
-   ]
-  },
-  {
-   "cell_type": "code",
-   "execution_count": 5,
-   "metadata": {},
-   "outputs": [
-    {
-     "ename": "FileNotFoundError",
-     "evalue": "[WinError 3] The system cannot find the path specified: './images/'",
-     "output_type": "error",
-     "traceback": [
-      "\u001b[1;31m---------------------------------------------------------------------------\u001b[0m",
-      "\u001b[1;31mFileNotFoundError\u001b[0m                         Traceback (most recent call last)",
-      "\u001b[1;32m<ipython-input-5-d6f5a6d6ebc5>\u001b[0m in \u001b[0;36m<module>\u001b[1;34m\u001b[0m\n\u001b[0;32m 42\u001b[0m \u001b[0mmodel\u001b[0m\u001b[1;33m.\u001b[0m\u001b[0mload_weights\u001b[0m\u001b[1;33m(\u001b[0m\u001b[0mweight_file\u001b[0m\u001b[1;33m)\u001b[0m\u001b[1;33m\u001b[0m\u001b[1;33m\u001b[0m\u001b[0m\n\u001b[0;32m 43\u001b[0m \u001b[1;33m\u001b[0m\u001b[0m\n\u001b[1;32m---> 44\u001b[1;33m \u001b[0mimage_names\u001b[0m \u001b[1;33m=\u001b[0m \u001b[1;33m[\u001b[0m\u001b[0mf\u001b[0m \u001b[1;32mfor\u001b[0m \u001b[0mf\u001b[0m \u001b[1;32min\u001b[0m \u001b[0mlistdir\u001b[0m\u001b[1;33m(\u001b[0m\u001b[0mimage_path\u001b[0m\u001b[1;33m)\u001b[0m \u001b[1;32mif\u001b[0m \u001b[0misfile\u001b[0m\u001b[1;33m(\u001b[0m\u001b[0mjoin\u001b[0m\u001b[1;33m(\u001b[0m\u001b[0mimage_path\u001b[0m\u001b[1;33m,\u001b[0m \u001b[0mf\u001b[0m\u001b[1;33m)\u001b[0m\u001b[1;33m)\u001b[0m\u001b[1;33m]\u001b[0m\u001b[1;33m\u001b[0m\u001b[1;33m\u001b[0m\u001b[0m\n\u001b[0m\u001b[0;32m 45\u001b[0m \u001b[1;33m\u001b[0m\u001b[0m\n\u001b[0;32m 46\u001b[0m \u001b[1;32mfor\u001b[0m \u001b[0mimage_name\u001b[0m \u001b[1;32min\u001b[0m \u001b[0mimage_names\u001b[0m\u001b[1;33m:\u001b[0m\u001b[1;33m\u001b[0m\u001b[1;33m\u001b[0m\u001b[0m\n",
-      "\u001b[1;31mFileNotFoundError\u001b[0m: [WinError 3] The system cannot find the path specified: './images/'"
-     ]
-    }
-   ],
-   "source": [
-    "from os import listdir\n",
-    "from os.path import isfile, join\n",
-    "import os\n",
-    "import cv2\n",
-    "\n",
-    "# Define image path here\n",
-    "image_path = \"./images/\"\n",
-    "\n",
-    "modhash = 'fbe63257a054c1c5466cfd7bf14646d6'\n",
-    "emotion_classes = {0: 'Angry', 1: 'Fear', 2: 'Happy', 3: 'Neutral', 4: 'Sad', 5: 'Surprise'}\n",
-    "\n",
-    "def face_detector(img):\n",
-    "    # Convert image to grayscale for faster detection\n",
-    "    gray = cv2.cvtColor(img.copy(), cv2.COLOR_BGR2GRAY)\n",
-    "    faces = face_classifier.detectMultiScale(gray, 1.3, 5)\n",
-    "    if len(faces) == 0:\n",
-    "        return False, (0,0,0,0), np.zeros((1,48,48,3), np.uint8), img\n",
-    "\n",
-    "    allfaces = []\n",
-    "    rects = []\n",
-    "    for (x,y,w,h) in faces:\n",
-    "        cv2.rectangle(img, (x,y), (x+w,y+h), (255,0,0), 2)\n",
-    "        roi = img[y:y+h, x:x+w]\n",
-    "        allfaces.append(roi)\n",
-    "        rects.append((x,w,y,h))\n",
-    "    return True, rects, allfaces, img\n",
-    "\n",
-    "# Define our model parameters\n",
-    "depth = 16\n",
-    "k = 8\n",
-    "weight_file = None\n",
-    "margin = 0.4\n",
-    "image_dir = None\n",
-    "\n",
-    "# Get our weight file\n",
-    "if not weight_file:\n",
-    "    weight_file = get_file(\"weights.28-3.73.hdf5\", pretrained_model, cache_subdir=\"pretrained_models\",\n",
-    "                           file_hash=modhash, cache_dir=Path(sys.argv[0]).resolve().parent)\n",
-    "# Load model and weights\n",
-    "img_size = 64\n",
-    "model = WideResNet(img_size, depth=depth, k=k)()\n",
-    "model.load_weights(weight_file)\n",
-    "\n",
-    "image_names = [f for f in listdir(image_path) if isfile(join(image_path, f))]\n",
-    "\n",
-    "for image_name in image_names:\n",
-    "    frame = cv2.imread(\"./images/\" + image_name)\n",
-    "    ret, rects, faces, image = face_detector(frame)\n",
-    "    preprocessed_faces_ag = []\n",
-    "    preprocessed_faces_emo = []\n",
-    "\n",
-    "    if ret:\n",
-    "        for (i, face) in enumerate(faces):\n",
-    "            face_ag = cv2.resize(face, (64, 64), interpolation=cv2.INTER_AREA)\n",
-    "            preprocessed_faces_ag.append(face_ag)\n",
-    "\n",
-    "            face_gray_emo = cv2.cvtColor(face, cv2.COLOR_BGR2GRAY)\n",
-    "            face_gray_emo = cv2.resize(face_gray_emo, (48, 48), interpolation=cv2.INTER_AREA)\n",
-    "            face_gray_emo = face_gray_emo.astype(\"float\") / 255.0\n",
-    "            face_gray_emo = img_to_array(face_gray_emo)\n",
-    "            face_gray_emo = np.expand_dims(face_gray_emo, axis=0)\n",
-    "            preprocessed_faces_emo.append(face_gray_emo)\n",
-    "\n",
-    "        # make a prediction for Age and Gender\n",
-    "        results = model.predict(np.array(preprocessed_faces_ag))\n",
-    "        predicted_genders = results[0]\n",
-    "        ages = np.arange(0, 101).reshape(101, 1)\n",
-    "        predicted_ages = results[1].dot(ages).flatten()\n",
-    "\n",
-    "        # make a prediction for Emotion\n",
-    "        emo_labels = []\n",
-    "        for (i, face) in enumerate(faces):\n",
-    "            preds = classifier.predict(preprocessed_faces_emo[i])[0]\n",
-    "            emo_labels.append(emotion_classes[preds.argmax()])\n",
-    "\n",
-    "        # overlay the predicted age, gender and emotion on each face\n",
-    "        for (i, face) in enumerate(faces):\n",
-    "            label = \"{}, {}, {}\".format(int(predicted_ages[i]),\n",
-    "                                        \"F\" if predicted_genders[i][0] > 0.4 else \"M\",\n",
-    "                                        emo_labels[i])\n",
-    "            label_position = (rects[i][0] + int(rects[i][1]/2), abs(rects[i][2] - 10))\n",
-    "            cv2.putText(image, label, label_position, cv2.FONT_HERSHEY_PLAIN, 1, (0,255,0), 2)\n",
-    "\n",
-    "    cv2.imshow(\"Emotion Detector\", image)\n",
-    "    cv2.waitKey(0)\n",
-    "\n",
-    "cv2.destroyAllWindows()"
-   ]
-  },
-  {
-   "cell_type": "markdown",
-   "metadata": {},
-   "source": [
-    "### Using Dlib's Face Detection"
-   ]
-  },
-  {
-   "cell_type": "code",
-   "execution_count": 7,
-   "metadata": {},
-   "outputs": [],
-   "source": [
-    "from os import listdir\n",
-    "from os.path import isfile, join\n",
-    "import os\n",
-    "import cv2\n",
-    "\n",
-    "# Define image path here\n",
-    "image_path = \"./images/\"\n",
-    "\n",
-    "modhash = 'fbe63257a054c1c5466cfd7bf14646d6'\n",
-    "emotion_classes = {0: 'Angry', 1: 'Fear', 2: 'Happy', 3: 'Neutral', 4: 'Sad', 5: 'Surprise'}\n",
-    "\n",
-    "def draw_label(image, point, label, font=cv2.FONT_HERSHEY_SIMPLEX,\n",
-    "               font_scale=0.8, thickness=1):\n",
-    "    size = cv2.getTextSize(label, font, font_scale, thickness)[0]\n",
-    "    x, y = point\n",
-    "    cv2.rectangle(image, (x, y - size[1]), (x + size[0], y), (255, 0, 0), cv2.FILLED)\n",
-    "    cv2.putText(image, label, point, font, font_scale, (255, 255, 255), thickness, lineType=cv2.LINE_AA)\n",
-    "\n",
-    "# Define our model parameters\n",
-    "depth = 16\n",
-    "k = 8\n",
-    "weight_file = None\n",
-    "margin = 0.4\n",
-    "image_dir = None\n",
-    "\n",
-    "# Get our weight file\n",
-    "if not weight_file:\n",
-    "    weight_file = get_file(\"weights.28-3.73.hdf5\", pretrained_model, cache_subdir=\"pretrained_models\",\n",
-    "                           file_hash=modhash, cache_dir=Path(sys.argv[0]).resolve().parent)\n",
-    "# Load model and weights\n",
-    "img_size = 64\n",
-    "model = WideResNet(img_size, depth=depth, k=k)()\n",
-    "model.load_weights(weight_file)\n",
-    "\n",
-    "detector = dlib.get_frontal_face_detector()\n",
-    "\n",
-    "image_names = [f for f in listdir(image_path) if isfile(join(image_path, f))]\n",
-    "\n",
-    "for image_name in image_names:\n",
-    "    frame = cv2.imread(\"./images/\" + image_name)\n",
-    "    preprocessed_faces_emo = []\n",
-    "\n",
-    "    input_img = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)\n",
-    "    img_h, img_w, _ = np.shape(input_img)\n",
-    "    detected = detector(frame, 1)\n",
-    "    faces = np.empty((len(detected), img_size, img_size, 3))\n",
-    "\n",
-    "    if len(detected) > 0:\n",
-    "        for i, d in enumerate(detected):\n",
-    "            x1, y1, x2, y2, w, h = d.left(), d.top(), d.right() + 1, d.bottom() + 1, d.width(), d.height()\n",
-    "            xw1 = max(int(x1 - margin * w), 0)\n",
-    "            yw1 = max(int(y1 - margin * h), 0)\n",
-    "            xw2 = min(int(x2 + margin * w), img_w - 1)\n",
-    "            yw2 = min(int(y2 + margin * h), img_h - 1)\n",
-    "            cv2.rectangle(frame, (x1, y1), (x2, y2), (255, 0, 0), 2)\n",
-    "            # cv2.rectangle(img, (xw1, yw1), (xw2, yw2), (255, 0, 0), 2)\n",
-    "            faces[i, :, :, :] = cv2.resize(frame[yw1:yw2 + 1, xw1:xw2 + 1, :], (img_size, img_size))\n",
-    "            face = frame[yw1:yw2 + 1, xw1:xw2 + 1, :]\n",
-    "            face_gray_emo = cv2.cvtColor(face, cv2.COLOR_BGR2GRAY)\n",
-    "            face_gray_emo = cv2.resize(face_gray_emo, (48, 48), interpolation=cv2.INTER_AREA)\n",
-    "            face_gray_emo = face_gray_emo.astype(\"float\") / 255.0\n",
-    "            face_gray_emo = img_to_array(face_gray_emo)\n",
-    "            face_gray_emo = np.expand_dims(face_gray_emo, axis=0)\n",
-    "            preprocessed_faces_emo.append(face_gray_emo)\n",
-    "\n",
-    "        # make a prediction for Age and Gender\n",
-    "        results = model.predict(np.array(faces))\n",
-    "        predicted_genders = results[0]\n",
-    "        ages = np.arange(0, 101).reshape(101, 1)\n",
-    "        predicted_ages = results[1].dot(ages).flatten()\n",
-    "\n",
-    "        # make a prediction for Emotion\n",
-    "        emo_labels = []\n",
-    "        for i, d in enumerate(detected):\n",
-    "            preds = classifier.predict(preprocessed_faces_emo[i])[0]\n",
-    "            emo_labels.append(emotion_classes[preds.argmax()])\n",
-    "\n",
-    "        # draw results\n",
-    "        for i, d in enumerate(detected):\n",
-    "            label = \"{}, {}, {}\".format(int(predicted_ages[i]),\n",
-    "                                        \"F\" if predicted_genders[i][0] > 0.4 else \"M\", emo_labels[i])\n",
-    "            draw_label(frame, (d.left(), d.top()), label)\n",
-    "\n",
-    "    cv2.imshow(\"Emotion Detector\", frame)\n",
-    "    cv2.waitKey(0)\n",
-    "\n",
-    "cv2.destroyAllWindows()"
-   ]
-  },
-  {
-   "cell_type": "markdown",
-   "metadata": {},
-   "source": [
-    "### And now using dlib's detector with our webcam"
-   ]
-  },
-  {
-   "cell_type": "code",
-   "execution_count": 8,
-   "metadata": {},
-   "outputs": [],
-   "source": [
-    "from os import listdir\n",
-    "from os.path import isfile, join\n",
-    "import os\n",
-    "import cv2\n",
-    "\n",
-    "# Define image path here\n",
-    "image_path = \"./images/\"\n",
-    "\n",
-    "modhash = 'fbe63257a054c1c5466cfd7bf14646d6'\n",
-    "emotion_classes = {0: 'Angry', 1: 'Fear', 2: 'Happy', 3: 'Neutral', 4: 'Sad', 5: 'Surprise'}\n",
-    "\n",
-    "def draw_label(image, point, label, font=cv2.FONT_HERSHEY_SIMPLEX,\n",
-    "               font_scale=0.8, thickness=1):\n",
-    "    size = cv2.getTextSize(label, font, font_scale, thickness)[0]\n",
-    "    x, y = point\n",
-    "    cv2.rectangle(image, (x, y - size[1]), (x + size[0], y), (255, 0, 0), cv2.FILLED)\n",
-    "    cv2.putText(image, label, point, font, font_scale, (255, 255, 255), thickness, lineType=cv2.LINE_AA)\n",
-    "\n",
-    "# Define our model parameters\n",
-    "depth = 16\n",
-    "k = 8\n",
-    "weight_file = None\n",
-    "margin = 0.4\n",
-    "image_dir = None\n",
-    "\n",
-    "# Get our weight file\n",
-    "if not weight_file:\n",
-    "    weight_file = get_file(\"weights.28-3.73.hdf5\", pretrained_model, cache_subdir=\"pretrained_models\",\n",
-    "                           file_hash=modhash, cache_dir=Path(sys.argv[0]).resolve().parent)\n",
-    "# Load model and weights\n",
-    "img_size = 64\n",
-    "model = WideResNet(img_size, depth=depth, k=k)()\n",
-    "model.load_weights(weight_file)\n",
-    "\n",
-    "detector = dlib.get_frontal_face_detector()\n",
-    "\n",
-    "# Initialize webcam\n",
-    "cap = cv2.VideoCapture(0)\n",
-    "\n",
-    "while True:\n",
-    "    ret, frame = cap.read()\n",
-    "    preprocessed_faces_emo = []\n",
-    "\n",
-    "    input_img = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)\n",
-    "    img_h, img_w, _ = np.shape(input_img)\n",
-    "    detected = detector(frame, 1)\n",
-    "    faces = np.empty((len(detected), img_size, img_size, 3))\n",
-    "\n",
-    "    if len(detected) > 0:\n",
-    "        for i, d in enumerate(detected):\n",
-    "            x1, y1, x2, y2, w, h = d.left(), d.top(), d.right() + 1, d.bottom() + 1, d.width(), d.height()\n",
-    "            xw1 = max(int(x1 - margin * w), 0)\n",
-    "            yw1 = max(int(y1 - margin * h), 0)\n",
-    "            xw2 = min(int(x2 + margin * w), img_w - 1)\n",
-    "            yw2 = min(int(y2 + margin * h), img_h - 1)\n",
-    "            cv2.rectangle(frame, (x1, y1), (x2, y2), (255, 0, 0), 2)\n",
-    "            # cv2.rectangle(img, (xw1, yw1), (xw2, yw2), (255, 0, 0), 2)\n",
-    "            faces[i, :, :, :] = cv2.resize(frame[yw1:yw2 + 1, xw1:xw2 + 1, :], (img_size, img_size))\n",
-    "            face = frame[yw1:yw2 + 1, xw1:xw2 + 1, :]\n",
-    "            face_gray_emo = cv2.cvtColor(face, cv2.COLOR_BGR2GRAY)\n",
-    "            face_gray_emo = cv2.resize(face_gray_emo, (48, 48), interpolation=cv2.INTER_AREA)\n",
-    "            face_gray_emo = face_gray_emo.astype(\"float\") / 255.0\n",
-    "            face_gray_emo = img_to_array(face_gray_emo)\n",
-    "            face_gray_emo = np.expand_dims(face_gray_emo, axis=0)\n",
-    "            preprocessed_faces_emo.append(face_gray_emo)\n",
-    "\n",
-    "        # make a prediction for Age and Gender\n",
-    "        results = model.predict(np.array(faces))\n",
-    "        predicted_genders = results[0]\n",
-    "        ages = np.arange(0, 101).reshape(101, 1)\n",
-    "        predicted_ages = results[1].dot(ages).flatten()\n",
-    "\n",
-    "        # make a prediction for Emotion\n",
-    "        emo_labels = []\n",
-    "        for i, d in enumerate(detected):\n",
-    "            preds = classifier.predict(preprocessed_faces_emo[i])[0]\n",
-    "            emo_labels.append(emotion_classes[preds.argmax()])\n",
-    "\n",
-    "        # draw results\n",
-    "        for i, d in enumerate(detected):\n",
-    "            label = \"{}, {}, {}\".format(int(predicted_ages[i]),\n",
-    "                                        \"F\" if predicted_genders[i][0] > 0.4 else \"M\", emo_labels[i])\n",
-    "            draw_label(frame, (d.left(), d.top()), label)\n",
-    "\n",
-    "    cv2.imshow(\"Emotion Detector\", frame)\n",
-    "    if cv2.waitKey(1) == 13:  # 13 is the Enter key\n",
-    "        break\n",
-    "\n",
-    "cap.release()\n",
-    "cv2.destroyAllWindows()"
-   ]
-  },
-  {
-   "cell_type": "code",
-   "execution_count": null,
-   "metadata": {},
-   "outputs": [],
-   "source": [
-    "\n"
-   ]
-  },
-  {
-   "cell_type": "code",
-   "execution_count": null,
-   "metadata": {},
-   "outputs": [],
-   "source": []
-  },
-  {
-   "cell_type": "code",
-   "execution_count": null,
-   "metadata": {},
-   "outputs": [],
-   "source": []
-  },
-  {
-   "cell_type": "code",
-   "execution_count": null,
-   "metadata": {},
-   "outputs": [],
-   "source": []
-  }
- ],
- "metadata": {
-  "kernelspec": {
-   "display_name": "Python 3",
-   "language": "python",
-   "name": "python3"
-  },
-  "language_info": {
-   "codemirror_mode": {
-    "name": "ipython",
-    "version": 3
-   },
-   "file_extension": ".py",
-   "mimetype": "text/x-python",
-   "name": "python",
-   "nbconvert_exporter": "python",
-   "pygments_lexer": "ipython3",
-   "version": "3.7.4"
-  }
- },
- "nbformat": 4,
- "nbformat_minor": 2
-}
+version https://git-lfs.github.com/spec/v1
+oid sha256:082576a2fac2c474e0313a34d9f68cae3a5b93c7c040fc4ec56d116e5b836733
+size 21733
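
Aside: both dlib cells above grow each detected box by `margin` times its width and height before cropping, clamping to the frame so the slice never leaves the image. A minimal standalone sketch of that arithmetic (the box and frame sizes below are hypothetical):

# Expand a dlib-style box (x1, y1, x2, y2) by a fractional margin,
# clamping to the image so the crop stays inside the frame.
def expand_box(x1, y1, x2, y2, img_w, img_h, margin=0.4):
    w, h = x2 - x1, y2 - y1
    xw1 = max(int(x1 - margin * w), 0)
    yw1 = max(int(y1 - margin * h), 0)
    xw2 = min(int(x2 + margin * w), img_w - 1)
    yw2 = min(int(y2 + margin * h), img_h - 1)
    return xw1, yw1, xw2, yw2

# Hypothetical 100x100 face near the top-left of a 640x480 frame:
print(expand_box(10, 10, 110, 110, 640, 480))  # (0, 0, 150, 150)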
18 . Deep Survaliance - Build a Face Detector with Emotion, Age and Gender Recognition/Face Detection - Friends Characters.ipynb
CHANGED
|
@@ -1,526 +1,3 @@
|
|
| 1 |
-
|
| 2 |
-
|
| 3 |
-
|
| 4 |
-
"cell_type": "markdown",
|
| 5 |
-
"metadata": {},
|
| 6 |
-
"source": [
|
| 7 |
-
"# Basic Deep Learning Face Recogntion\n",
|
| 8 |
-
"## Building a Friends TV Show Character Identifier"
|
| 9 |
-
]
|
| 10 |
-
},
|
| 11 |
-
{
|
| 12 |
-
"cell_type": "markdown",
|
| 13 |
-
"metadata": {},
|
| 14 |
-
"source": [
|
| 15 |
-
"### Let's train our model\n",
|
| 16 |
-
"I've created a dataset with the faces of 4 Friends characters taken from a handful of different scenes."
|
| 17 |
-
]
|
| 18 |
-
},
|
| 19 |
-
{
|
| 20 |
-
"cell_type": "code",
|
| 21 |
-
"execution_count": 33,
|
| 22 |
-
"metadata": {},
|
| 23 |
-
"outputs": [
|
| 24 |
-
{
|
| 25 |
-
"name": "stdout",
|
| 26 |
-
"output_type": "stream",
|
| 27 |
-
"text": [
|
| 28 |
-
"Found 2663 images belonging to 4 classes.\n",
|
| 29 |
-
"Found 955 images belonging to 4 classes.\n"
|
| 30 |
-
]
|
| 31 |
-
}
|
| 32 |
-
],
|
| 33 |
-
"source": [
|
| 34 |
-
"from __future__ import print_function\n",
|
| 35 |
-
"import keras\n",
|
| 36 |
-
"from keras.preprocessing.image import ImageDataGenerator\n",
|
| 37 |
-
"from keras.models import Sequential\n",
|
| 38 |
-
"from keras.layers import Dense, Dropout, Activation, Flatten, BatchNormalization\n",
|
| 39 |
-
"from keras.layers import Conv2D, MaxPooling2D\n",
|
| 40 |
-
"from keras.preprocessing.image import ImageDataGenerator\n",
|
| 41 |
-
"import os\n",
|
| 42 |
-
"\n",
|
| 43 |
-
"num_classes = 4\n",
|
| 44 |
-
"img_rows, img_cols = 48, 48\n",
|
| 45 |
-
"batch_size = 16\n",
|
| 46 |
-
"\n",
|
| 47 |
-
"train_data_dir = './faces/train'\n",
|
| 48 |
-
"validation_data_dir = './faces/validation'\n",
|
| 49 |
-
"\n",
|
| 50 |
-
"# Let's use some data augmentaiton \n",
|
| 51 |
-
"train_datagen = ImageDataGenerator(\n",
|
| 52 |
-
" rescale=1./255,\n",
|
| 53 |
-
" rotation_range=30,\n",
|
| 54 |
-
" shear_range=0.3,\n",
|
| 55 |
-
" zoom_range=0.3,\n",
|
| 56 |
-
" width_shift_range=0.4,\n",
|
| 57 |
-
" height_shift_range=0.4,\n",
|
| 58 |
-
" horizontal_flip=True,\n",
|
| 59 |
-
" fill_mode='nearest')\n",
|
| 60 |
-
" \n",
|
| 61 |
-
"validation_datagen = ImageDataGenerator(rescale=1./255)\n",
|
| 62 |
-
" \n",
|
| 63 |
-
"train_generator = train_datagen.flow_from_directory(\n",
|
| 64 |
-
" train_data_dir,\n",
|
| 65 |
-
" target_size=(img_rows, img_cols),\n",
|
| 66 |
-
" batch_size=batch_size,\n",
|
| 67 |
-
" class_mode='categorical',\n",
|
| 68 |
-
" shuffle=True)\n",
|
| 69 |
-
" \n",
|
| 70 |
-
"validation_generator = validation_datagen.flow_from_directory(\n",
|
| 71 |
-
" validation_data_dir,\n",
|
| 72 |
-
" target_size=(img_rows, img_cols),\n",
|
| 73 |
-
" batch_size=batch_size,\n",
|
| 74 |
-
" class_mode='categorical',\n",
|
| 75 |
-
" shuffle=True)"
|
| 76 |
-
]
|
| 77 |
-
},
|
| 78 |
-
{
|
| 79 |
-
"cell_type": "code",
|
| 80 |
-
"execution_count": 37,
|
| 81 |
-
"metadata": {},
|
| 82 |
-
"outputs": [],
|
| 83 |
-
"source": [
|
| 84 |
-
"#Our Keras imports\n",
|
| 85 |
-
"from keras.models import Sequential\n",
|
| 86 |
-
"from keras.layers.normalization import BatchNormalization\n",
|
| 87 |
-
"from keras.layers.convolutional import Conv2D, MaxPooling2D\n",
|
| 88 |
-
"from keras.layers.advanced_activations import ELU\n",
|
| 89 |
-
"from keras.layers.core import Activation, Flatten, Dropout, Dense"
|
| 90 |
-
]
|
| 91 |
-
},
|
| 92 |
-
{
|
| 93 |
-
"cell_type": "markdown",
|
| 94 |
-
"metadata": {},
|
| 95 |
-
"source": [
|
| 96 |
-
"### Creating a simple VGG based model for Face Recognition"
|
| 97 |
-
]
|
| 98 |
-
},
|
| 99 |
-
{
|
| 100 |
-
"cell_type": "code",
|
| 101 |
-
"execution_count": 35,
|
| 102 |
-
"metadata": {},
|
| 103 |
-
"outputs": [
|
| 104 |
-
{
|
| 105 |
-
"name": "stdout",
|
| 106 |
-
"output_type": "stream",
|
| 107 |
-
"text": [
|
| 108 |
-
"_________________________________________________________________\n",
|
| 109 |
-
"Layer (type) Output Shape Param # \n",
|
| 110 |
-
"=================================================================\n",
|
| 111 |
-
"conv2d_25 (Conv2D) (None, 48, 48, 32) 896 \n",
|
| 112 |
-
"_________________________________________________________________\n",
|
| 113 |
-
"activation_34 (Activation) (None, 48, 48, 32) 0 \n",
|
| 114 |
-
"_________________________________________________________________\n",
|
| 115 |
-
"batch_normalization_31 (Batc (None, 48, 48, 32) 128 \n",
|
| 116 |
-
"_________________________________________________________________\n",
|
| 117 |
-
"conv2d_26 (Conv2D) (None, 48, 48, 32) 9248 \n",
|
| 118 |
-
"_________________________________________________________________\n",
|
| 119 |
-
"activation_35 (Activation) (None, 48, 48, 32) 0 \n",
|
| 120 |
-
"_________________________________________________________________\n",
|
| 121 |
-
"batch_normalization_32 (Batc (None, 48, 48, 32) 128 \n",
|
| 122 |
-
"_________________________________________________________________\n",
|
| 123 |
-
"max_pooling2d_13 (MaxPooling (None, 24, 24, 32) 0 \n",
|
| 124 |
-
"_________________________________________________________________\n",
|
| 125 |
-
"dropout_19 (Dropout) (None, 24, 24, 32) 0 \n",
|
| 126 |
-
"_________________________________________________________________\n",
|
| 127 |
-
"conv2d_27 (Conv2D) (None, 24, 24, 64) 18496 \n",
|
| 128 |
-
"_________________________________________________________________\n",
|
| 129 |
-
"activation_36 (Activation) (None, 24, 24, 64) 0 \n",
|
| 130 |
-
"_________________________________________________________________\n",
|
| 131 |
-
"batch_normalization_33 (Batc (None, 24, 24, 64) 256 \n",
|
| 132 |
-
"_________________________________________________________________\n",
|
| 133 |
-
"conv2d_28 (Conv2D) (None, 24, 24, 64) 36928 \n",
|
| 134 |
-
"_________________________________________________________________\n",
|
| 135 |
-
"activation_37 (Activation) (None, 24, 24, 64) 0 \n",
|
| 136 |
-
"_________________________________________________________________\n",
|
| 137 |
-
"batch_normalization_34 (Batc (None, 24, 24, 64) 256 \n",
|
| 138 |
-
"_________________________________________________________________\n",
|
| 139 |
-
"max_pooling2d_14 (MaxPooling (None, 12, 12, 64) 0 \n",
|
| 140 |
-
"_________________________________________________________________\n",
|
| 141 |
-
"dropout_20 (Dropout) (None, 12, 12, 64) 0 \n",
|
| 142 |
-
"_________________________________________________________________\n",
|
| 143 |
-
"conv2d_29 (Conv2D) (None, 12, 12, 128) 73856 \n",
|
| 144 |
-
"_________________________________________________________________\n",
|
| 145 |
-
"activation_38 (Activation) (None, 12, 12, 128) 0 \n",
|
| 146 |
-
"_________________________________________________________________\n",
|
| 147 |
-
"batch_normalization_35 (Batc (None, 12, 12, 128) 512 \n",
|
| 148 |
-
"_________________________________________________________________\n",
|
| 149 |
-
"conv2d_30 (Conv2D) (None, 12, 12, 128) 147584 \n",
|
| 150 |
-
"_________________________________________________________________\n",
|
| 151 |
-
"activation_39 (Activation) (None, 12, 12, 128) 0 \n",
|
| 152 |
-
"_________________________________________________________________\n",
|
| 153 |
-
"batch_normalization_36 (Batc (None, 12, 12, 128) 512 \n",
|
| 154 |
-
"_________________________________________________________________\n",
|
| 155 |
-
"max_pooling2d_15 (MaxPooling (None, 6, 6, 128) 0 \n",
|
| 156 |
-
"_________________________________________________________________\n",
|
| 157 |
-
"dropout_21 (Dropout) (None, 6, 6, 128) 0 \n",
|
| 158 |
-
"_________________________________________________________________\n",
|
| 159 |
-
"conv2d_31 (Conv2D) (None, 6, 6, 256) 295168 \n",
|
| 160 |
-
"_________________________________________________________________\n",
|
| 161 |
-
"activation_40 (Activation) (None, 6, 6, 256) 0 \n",
|
| 162 |
-
"_________________________________________________________________\n",
|
| 163 |
-
"batch_normalization_37 (Batc (None, 6, 6, 256) 1024 \n",
|
| 164 |
-
"_________________________________________________________________\n",
|
| 165 |
-
"conv2d_32 (Conv2D) (None, 6, 6, 256) 590080 \n",
|
| 166 |
-
"_________________________________________________________________\n",
|
| 167 |
-
"activation_41 (Activation) (None, 6, 6, 256) 0 \n",
|
| 168 |
-
"_________________________________________________________________\n",
|
| 169 |
-
"batch_normalization_38 (Batc (None, 6, 6, 256) 1024 \n",
|
| 170 |
-
"_________________________________________________________________\n",
|
| 171 |
-
"max_pooling2d_16 (MaxPooling (None, 3, 3, 256) 0 \n",
|
| 172 |
-
"_________________________________________________________________\n",
|
| 173 |
-
"dropout_22 (Dropout) (None, 3, 3, 256) 0 \n",
|
| 174 |
-
"_________________________________________________________________\n",
|
| 175 |
-
"flatten_4 (Flatten) (None, 2304) 0 \n",
|
| 176 |
-
"_________________________________________________________________\n",
|
| 177 |
-
"dense_10 (Dense) (None, 64) 147520 \n",
|
| 178 |
-
"_________________________________________________________________\n",
|
| 179 |
-
"activation_42 (Activation) (None, 64) 0 \n",
|
| 180 |
-
"_________________________________________________________________\n",
|
| 181 |
-
"batch_normalization_39 (Batc (None, 64) 256 \n",
|
| 182 |
-
"_________________________________________________________________\n",
|
| 183 |
-
"dropout_23 (Dropout) (None, 64) 0 \n",
|
| 184 |
-
"_________________________________________________________________\n",
|
| 185 |
-
"dense_11 (Dense) (None, 64) 4160 \n",
|
| 186 |
-
"_________________________________________________________________\n",
|
| 187 |
-
"activation_43 (Activation) (None, 64) 0 \n",
|
| 188 |
-
"_________________________________________________________________\n",
|
| 189 |
-
"batch_normalization_40 (Batc (None, 64) 256 \n",
|
| 190 |
-
"_________________________________________________________________\n",
|
| 191 |
-
"dropout_24 (Dropout) (None, 64) 0 \n",
|
| 192 |
-
"_________________________________________________________________\n",
|
| 193 |
-
"dense_12 (Dense) (None, 4) 260 \n",
|
| 194 |
-
"_________________________________________________________________\n",
|
| 195 |
-
"activation_44 (Activation) (None, 4) 0 \n",
|
| 196 |
-
"=================================================================\n",
|
| 197 |
-
"Total params: 1,328,548\n",
|
| 198 |
-
"Trainable params: 1,326,372\n",
|
| 199 |
-
"Non-trainable params: 2,176\n",
|
| 200 |
-
"_________________________________________________________________\n",
|
| 201 |
-
"None\n"
|
| 202 |
-
]
|
| 203 |
-
}
|
| 204 |
-
],
|
| 205 |
-
"source": [
|
| 206 |
-
"model = Sequential()\n",
|
| 207 |
-
"\n",
|
| 208 |
-
"model.add(Conv2D(32, (3, 3), padding = 'same', kernel_initializer=\"he_normal\",\n",
|
| 209 |
-
" input_shape = (img_rows, img_cols, 3)))\n",
|
| 210 |
-
"model.add(Activation('elu'))\n",
|
| 211 |
-
"model.add(BatchNormalization())\n",
|
| 212 |
-
"model.add(Conv2D(32, (3, 3), padding = \"same\", kernel_initializer=\"he_normal\", \n",
|
| 213 |
-
" input_shape = (img_rows, img_cols, 3)))\n",
|
| 214 |
-
"model.add(Activation('elu'))\n",
|
| 215 |
-
"model.add(BatchNormalization())\n",
|
| 216 |
-
"model.add(MaxPooling2D(pool_size=(2, 2)))\n",
|
| 217 |
-
"model.add(Dropout(0.2))\n",
|
| 218 |
-
"\n",
|
| 219 |
-
"# Block #2: second CONV => RELU => CONV => RELU => POOL\n",
|
| 220 |
-
"# layer set\n",
|
| 221 |
-
"model.add(Conv2D(64, (3, 3), padding=\"same\", kernel_initializer=\"he_normal\"))\n",
|
| 222 |
-
"model.add(Activation('elu'))\n",
|
| 223 |
-
"model.add(BatchNormalization())\n",
|
| 224 |
-
"model.add(Conv2D(64, (3, 3), padding=\"same\", kernel_initializer=\"he_normal\"))\n",
|
| 225 |
-
"model.add(Activation('elu'))\n",
|
| 226 |
-
"model.add(BatchNormalization())\n",
|
| 227 |
-
"model.add(MaxPooling2D(pool_size=(2, 2)))\n",
|
| 228 |
-
"model.add(Dropout(0.2))\n",
|
| 229 |
-
"\n",
|
| 230 |
-
"# Block #3: third CONV => RELU => CONV => RELU => POOL\n",
|
| 231 |
-
"# layer set\n",
|
| 232 |
-
"model.add(Conv2D(128, (3, 3), padding=\"same\", kernel_initializer=\"he_normal\"))\n",
|
| 233 |
-
"model.add(Activation('elu'))\n",
|
| 234 |
-
"model.add(BatchNormalization())\n",
|
| 235 |
-
"model.add(Conv2D(128, (3, 3), padding=\"same\", kernel_initializer=\"he_normal\"))\n",
|
| 236 |
-
"model.add(Activation('elu'))\n",
|
| 237 |
-
"model.add(BatchNormalization())\n",
|
| 238 |
-
"model.add(MaxPooling2D(pool_size=(2, 2)))\n",
|
| 239 |
-
"model.add(Dropout(0.2))\n",
|
| 240 |
-
"\n",
|
| 241 |
-
"# Block #4: third CONV => RELU => CONV => RELU => POOL\n",
|
| 242 |
-
"# layer set\n",
|
| 243 |
-
"model.add(Conv2D(256, (3, 3), padding=\"same\", kernel_initializer=\"he_normal\"))\n",
|
| 244 |
-
"model.add(Activation('elu'))\n",
|
| 245 |
-
"model.add(BatchNormalization())\n",
|
| 246 |
-
"model.add(Conv2D(256, (3, 3), padding=\"same\", kernel_initializer=\"he_normal\"))\n",
|
| 247 |
-
"model.add(Activation('elu'))\n",
|
| 248 |
-
"model.add(BatchNormalization())\n",
|
| 249 |
-
"model.add(MaxPooling2D(pool_size=(2, 2)))\n",
|
| 250 |
-
"model.add(Dropout(0.2))\n",
|
| 251 |
-
"\n",
|
| 252 |
-
"# Block #5: first set of FC => RELU layers\n",
|
| 253 |
-
"model.add(Flatten())\n",
|
| 254 |
-
"model.add(Dense(64, kernel_initializer=\"he_normal\"))\n",
|
| 255 |
-
"model.add(Activation('elu'))\n",
|
| 256 |
-
"model.add(BatchNormalization())\n",
|
| 257 |
-
"model.add(Dropout(0.5))\n",
|
| 258 |
-
"\n",
|
| 259 |
-
"# Block #6: second set of FC => RELU layers\n",
|
| 260 |
-
"model.add(Dense(64, kernel_initializer=\"he_normal\"))\n",
|
| 261 |
-
"model.add(Activation('elu'))\n",
|
| 262 |
-
"model.add(BatchNormalization())\n",
|
| 263 |
-
"model.add(Dropout(0.5))\n",
|
| 264 |
-
"\n",
|
| 265 |
-
"# Block #7: softmax classifier\n",
|
| 266 |
-
"model.add(Dense(num_classes, kernel_initializer=\"he_normal\"))\n",
|
| 267 |
-
"model.add(Activation(\"softmax\"))\n",
|
| 268 |
-
"\n",
|
| 269 |
-
"print(model.summary())"
|
| 270 |
-
]
|
| 271 |
-
},
|
| 272 |
-
{
|
| 273 |
-
"cell_type": "markdown",
|
| 274 |
-
"metadata": {},
|
| 275 |
-
"source": [
|
| 276 |
-
"### Training our Model"
|
| 277 |
-
]
|
| 278 |
-
},
|
| 279 |
-
{
|
| 280 |
-
"cell_type": "code",
|
| 281 |
-
"execution_count": 36,
|
| 282 |
-
"metadata": {},
|
| 283 |
-
"outputs": [
|
| 284 |
-
{
|
| 285 |
-
"name": "stdout",
|
| 286 |
-
"output_type": "stream",
|
| 287 |
-
"text": [
|
| 288 |
-
"Epoch 1/10\n",
|
| 289 |
-
"166/166 [==============================] - 76s 457ms/step - loss: 1.1153 - acc: 0.5700 - val_loss: 1.4428 - val_acc: 0.4841\n",
|
| 290 |
-
"\n",
|
| 291 |
-
"Epoch 00001: val_loss improved from inf to 1.44279, saving model to /home/deeplearningcv/DeepLearningCV/Trained Models/face_recognition_friends_vgg.h5\n",
|
| 292 |
-
"Epoch 2/10\n",
|
| 293 |
-
"166/166 [==============================] - 67s 403ms/step - loss: 0.7034 - acc: 0.7343 - val_loss: 3.7705 - val_acc: 0.2705\n",
|
| 294 |
-
"\n",
|
| 295 |
-
"Epoch 00002: val_loss did not improve from 1.44279\n",
|
| 296 |
-
"Epoch 3/10\n",
|
| 297 |
-
"166/166 [==============================] - 62s 373ms/step - loss: 0.6037 - acc: 0.7690 - val_loss: 0.9403 - val_acc: 0.6912\n",
|
| 298 |
-
"\n",
|
| 299 |
-
"Epoch 00003: val_loss improved from 1.44279 to 0.94025, saving model to /home/deeplearningcv/DeepLearningCV/Trained Models/face_recognition_friends_vgg.h5\n",
|
| 300 |
-
"Epoch 4/10\n",
|
| 301 |
-
"166/166 [==============================] - 62s 373ms/step - loss: 0.5432 - acc: 0.7988 - val_loss: 1.3018 - val_acc: 0.5548\n",
|
| 302 |
-
"\n",
|
| 303 |
-
"Epoch 00004: val_loss did not improve from 0.94025\n",
|
| 304 |
-
"Epoch 5/10\n",
|
| 305 |
-
"166/166 [==============================] - 69s 414ms/step - loss: 0.4715 - acc: 0.8301 - val_loss: 3.8879 - val_acc: 0.1534\n",
|
| 306 |
-
"\n",
|
| 307 |
-
"Epoch 00005: val_loss did not improve from 0.94025\n",
|
| 308 |
-
"Epoch 6/10\n",
|
| 309 |
-
"166/166 [==============================] - 77s 467ms/step - loss: 0.4233 - acc: 0.8524 - val_loss: 0.6878 - val_acc: 0.7093\n",
|
| 310 |
-
"\n",
|
| 311 |
-
"Epoch 00006: val_loss improved from 0.94025 to 0.68784, saving model to /home/deeplearningcv/DeepLearningCV/Trained Models/face_recognition_friends_vgg.h5\n",
|
| 312 |
-
"Epoch 7/10\n",
|
| 313 |
-
"166/166 [==============================] - 71s 429ms/step - loss: 0.4130 - acc: 0.8636 - val_loss: 3.3402 - val_acc: 0.2971\n",
|
| 314 |
-
"\n",
|
| 315 |
-
"Epoch 00007: val_loss did not improve from 0.68784\n",
|
| 316 |
-
"Epoch 8/10\n",
|
| 317 |
-
"166/166 [==============================] - 79s 477ms/step - loss: 0.3821 - acc: 0.8748 - val_loss: 2.6729 - val_acc: 0.6283\n",
|
| 318 |
-
"\n",
|
| 319 |
-
"Epoch 00008: val_loss did not improve from 0.68784\n",
|
| 320 |
-
"Epoch 9/10\n",
|
| 321 |
-
"166/166 [==============================] - 86s 519ms/step - loss: 0.3622 - acc: 0.8709 - val_loss: 1.5067 - val_acc: 0.5197\n",
|
| 322 |
-
"Restoring model weights from the end of the best epoch\n",
|
| 323 |
-
"\n",
|
| 324 |
-
"Epoch 00009: val_loss did not improve from 0.68784\n",
|
| 325 |
-
"\n",
|
| 326 |
-
"Epoch 00009: ReduceLROnPlateau reducing learning rate to 0.0019999999552965165.\n",
|
| 327 |
-
"Epoch 00009: early stopping\n"
|
| 328 |
-
]
|
| 329 |
-
}
|
| 330 |
-
],
|
| 331 |
-
"source": [
|
| 332 |
-
"from keras.optimizers import RMSprop, SGD, Adam\n",
|
| 333 |
-
"from keras.callbacks import ModelCheckpoint, EarlyStopping, ReduceLROnPlateau\n",
|
| 334 |
-
"\n",
|
| 335 |
-
" \n",
|
| 336 |
-
"checkpoint = ModelCheckpoint(\"/home/deeplearningcv/DeepLearningCV/Trained Models/face_recognition_friends_vgg.h5\",\n",
|
| 337 |
-
" monitor=\"val_loss\",\n",
|
| 338 |
-
" mode=\"min\",\n",
|
| 339 |
-
" save_best_only = True,\n",
|
| 340 |
-
" verbose=1)\n",
|
| 341 |
-
"\n",
|
| 342 |
-
"earlystop = EarlyStopping(monitor = 'val_loss', \n",
|
| 343 |
-
" min_delta = 0, \n",
|
| 344 |
-
" patience = 3,\n",
|
| 345 |
-
" verbose = 1,\n",
|
| 346 |
-
" restore_best_weights = True)\n",
|
| 347 |
-
"\n",
|
| 348 |
-
"reduce_lr = ReduceLROnPlateau(monitor = 'val_loss', factor = 0.2, patience = 3, verbose = 1, min_delta = 0.0001)\n",
|
| 349 |
-
"\n",
|
| 350 |
-
"# we put our call backs into a callback list\n",
|
| 351 |
-
"callbacks = [earlystop, checkpoint, reduce_lr]\n",
|
| 352 |
-
"\n",
|
| 353 |
-
"# We use a very small learning rate \n",
|
| 354 |
-
"model.compile(loss = 'categorical_crossentropy',\n",
|
| 355 |
-
" optimizer = Adam(lr=0.01),\n",
|
| 356 |
-
" metrics = ['accuracy'])\n",
|
| 357 |
-
"\n",
|
| 358 |
-
"nb_train_samples = 2663\n",
|
| 359 |
-
"nb_validation_samples = 955\n",
|
| 360 |
-
"epochs = 10\n",
|
| 361 |
-
"\n",
|
| 362 |
-
"history = model.fit_generator(\n",
|
| 363 |
-
" train_generator,\n",
|
| 364 |
-
" steps_per_epoch = nb_train_samples // batch_size,\n",
|
| 365 |
-
" epochs = epochs,\n",
|
| 366 |
-
" callbacks = callbacks,\n",
|
| 367 |
-
" validation_data = validation_generator,\n",
|
| 368 |
-
" validation_steps = nb_validation_samples // batch_size)"
|
| 369 |
-
]
|
| 370 |
-
},
|
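For reference, the History object returned by fit_generator above records the same per-epoch curves that appear in the log. A minimal sketch for visualizing them (not part of the original notebook; plot_history is an illustrative helper, and the 'acc'/'val_acc' keys match the metric names this Keras version prints):

    import matplotlib.pyplot as plt

    def plot_history(history):
        # history.history holds one list per metric, with one entry per epoch
        fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(12, 4))
        ax1.plot(history.history['loss'], label='train')
        ax1.plot(history.history['val_loss'], label='validation')
        ax1.set_title('Loss'); ax1.set_xlabel('epoch'); ax1.legend()
        ax2.plot(history.history['acc'], label='train')
        ax2.plot(history.history['val_acc'], label='validation')
        ax2.set_title('Accuracy'); ax2.set_xlabel('epoch'); ax2.legend()
        plt.show()

    plot_history(history)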
- {
- "cell_type": "markdown",
- "metadata": {},
- "source": [
- "#### Getting our Class Labels"
- ]
- },
- {
- "cell_type": "code",
- "execution_count": 39,
- "metadata": {},
- "outputs": [
- {
- "data": {
- "text/plain": [
- "{0: 'Chandler', 1: 'Joey', 2: 'Pheobe', 3: 'Rachel'}"
- ]
- },
- "execution_count": 39,
- "metadata": {},
- "output_type": "execute_result"
- }
- ],
- "source": [
- "class_labels = validation_generator.class_indices\n",
- "class_labels = {v: k for k, v in class_labels.items()}\n",
- "classes = list(class_labels.values())\n",
- "class_labels"
- ]
- },
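The inversion above is needed because flow_from_directory assigns integer labels to class folders in alphabetical order; a minimal sketch of the round trip (the mapping shown matches the cell output above):

    # class_indices maps folder name -> integer label, assigned alphabetically:
    #   {'Chandler': 0, 'Joey': 1, 'Pheobe': 2, 'Rachel': 3}
    class_labels = {v: k for k, v in validation_generator.class_indices.items()}
    print(class_labels[0])  # -> 'Chandler'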
- {
- "cell_type": "code",
- "execution_count": null,
- "metadata": {},
- "outputs": [],
- "source": [
- "# Load our model\n",
- "from keras.models import load_model\n",
- "\n",
- "classifier = load_model('/home/deeplearningcv/DeepLearningCV/Trained Models/face_recognition_friends_vgg.h5')"
- ]
- },
- {
- "cell_type": "markdown",
- "metadata": {},
- "source": [
- "### Testing our model on some real video"
- ]
- },
- {
- "cell_type": "code",
- "execution_count": 43,
- "metadata": {},
- "outputs": [],
- "source": [
- "from os import listdir\n",
- "from os.path import isfile, join\n",
- "import os\n",
- "import cv2\n",
- "import dlib\n",
- "import numpy as np\n",
- "from keras.preprocessing.image import img_to_array\n",
- "\n",
- "face_classes = {0: 'Chandler', 1: 'Joey', 2: 'Pheobe', 3: 'Rachel'}\n",
- "\n",
- "def draw_label(image, point, label, font=cv2.FONT_HERSHEY_SIMPLEX,\n",
- "               font_scale=0.8, thickness=1):\n",
- "    size = cv2.getTextSize(label, font, font_scale, thickness)[0]\n",
- "    x, y = point\n",
- "    cv2.rectangle(image, (x, y - size[1]), (x + size[0], y), (255, 0, 0), cv2.FILLED)\n",
- "    cv2.putText(image, label, point, font, font_scale, (255, 255, 255), thickness, lineType=cv2.LINE_AA)\n",
- "\n",
- "# Face crop margin and detector settings\n",
- "margin = 0.2\n",
- "img_size = 64\n",
- "\n",
- "detector = dlib.get_frontal_face_detector()\n",
- "\n",
- "cap = cv2.VideoCapture('testfriends.mp4')\n",
- "\n",
- "while True:\n",
- "    ret, frame = cap.read()\n",
- "    if not ret:  # stop when the video ends\n",
- "        break\n",
- "    frame = cv2.resize(frame, None, fx=0.5, fy=0.5, interpolation=cv2.INTER_LINEAR)\n",
- "    preprocessed_faces = []\n",
- "\n",
- "    input_img = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)\n",
- "    img_h, img_w, _ = np.shape(input_img)\n",
- "    detected = detector(frame, 1)\n",
- "\n",
- "    if len(detected) > 0:\n",
- "        for i, d in enumerate(detected):\n",
- "            x1, y1, x2, y2, w, h = d.left(), d.top(), d.right() + 1, d.bottom() + 1, d.width(), d.height()\n",
- "            xw1 = max(int(x1 - margin * w), 0)\n",
- "            yw1 = max(int(y1 - margin * h), 0)\n",
- "            xw2 = min(int(x2 + margin * w), img_w - 1)\n",
- "            yw2 = min(int(y2 + margin * h), img_h - 1)\n",
- "            cv2.rectangle(frame, (x1, y1), (x2, y2), (255, 0, 0), 2)\n",
- "            face = frame[yw1:yw2 + 1, xw1:xw2 + 1, :]\n",
- "            face = cv2.resize(face, (48, 48), interpolation=cv2.INTER_AREA)\n",
- "            face = face.astype(\"float\") / 255.0\n",
- "            face = img_to_array(face)\n",
- "            face = np.expand_dims(face, axis=0)\n",
- "            preprocessed_faces.append(face)\n",
- "\n",
- "        # Make a prediction for each detected face\n",
- "        face_labels = []\n",
- "        for i, d in enumerate(detected):\n",
- "            preds = classifier.predict(preprocessed_faces[i])[0]\n",
- "            face_labels.append(face_classes[preds.argmax()])\n",
- "\n",
- "        # Draw results\n",
- "        for i, d in enumerate(detected):\n",
- "            label = \"{}\".format(face_labels[i])\n",
- "            draw_label(frame, (d.left(), d.top()), label)\n",
- "\n",
- "    cv2.imshow(\"Friend Character Identifier\", frame)\n",
- "    if cv2.waitKey(1) == 13:  # 13 is the Enter key\n",
- "        break\n",
- "\n",
- "cap.release()\n",
- "cv2.destroyAllWindows()"
- ]
- },
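The loop above calls classifier.predict() once per face. A possible refinement (an assumption, not in the original notebook) is to stack all crops for a frame and run a single forward pass:

    import numpy as np

    if preprocessed_faces:
        batch = np.vstack(preprocessed_faces)                    # (n_faces, 48, 48, 3)
        preds = classifier.predict(batch)                        # (n_faces, 4)
        face_labels = [face_classes[p.argmax()] for p in preds]  # same labels, one predict() call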
- {
- "cell_type": "code",
- "execution_count": null,
- "metadata": {},
- "outputs": [],
- "source": []
- }
- ],
- "metadata": {
- "kernelspec": {
- "display_name": "Python 3",
- "language": "python",
- "name": "python3"
- },
- "language_info": {
- "codemirror_mode": {
- "name": "ipython",
- "version": 3
- },
- "file_extension": ".py",
- "mimetype": "text/x-python",
- "name": "python",
- "nbconvert_exporter": "python",
- "pygments_lexer": "ipython3",
- "version": "3.6.6"
- }
- },
- "nbformat": 4,
- "nbformat_minor": 2
- }
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:04d2f477b406967b9799263dd60be2887f2e6b89c802c005d9b0d49cda5fd575
+ size 22647
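(The three "+" lines above are the standard Git LFS pointer that now stands in for the notebook: the pointer-spec version, the SHA-256 of the real content, and its size in bytes. Every "+3 -N" entry in this upload follows the same pattern; running `git lfs pull` in a checkout replaces the pointers with the actual notebook contents.)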
18 . Deep Survaliance - Build a Face Detector with Emotion, Age and Gender Recognition/Face Extraction from Video.ipynb
CHANGED
@@ -1,93 +1,3 @@
- {
- "cells": [
- {
- "cell_type": "markdown",
- "metadata": {},
- "source": [
- "### Extracting the faces from a video"
- ]
- },
- {
- "cell_type": "code",
- "execution_count": null,
- "metadata": {},
- "outputs": [],
- "source": [
- "from os import listdir\n",
- "from os.path import isfile, join\n",
- "import os\n",
- "import cv2\n",
- "import dlib\n",
- "import numpy as np\n",
- "\n",
- "# Define image path here\n",
- "image_path = \"./images/\"\n",
- "\n",
- "def draw_label(image, point, label, font=cv2.FONT_HERSHEY_SIMPLEX,\n",
- "               font_scale=0.8, thickness=1):\n",
- "    size = cv2.getTextSize(label, font, font_scale, thickness)[0]\n",
- "    x, y = point\n",
- "    cv2.rectangle(image, (x, y - size[1]), (x + size[0], y), (255, 0, 0), cv2.FILLED)\n",
- "    cv2.putText(image, label, point, font, font_scale, (255, 255, 255), thickness, lineType=cv2.LINE_AA)\n",
- "\n",
- "detector = dlib.get_frontal_face_detector()\n",
- "\n",
- "# Open the source video\n",
- "cap = cv2.VideoCapture('testfriends.mp4')\n",
- "img_size = 64\n",
- "margin = 0.2\n",
- "frame_count = 0\n",
- "\n",
- "while True:\n",
- "    ret, frame = cap.read()\n",
- "    if not ret:  # stop once the video is exhausted\n",
- "        break\n",
- "    frame_count += 1\n",
- "    print(frame_count)\n",
- "\n",
- "    input_img = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)\n",
- "    img_h, img_w, _ = np.shape(input_img)\n",
- "    detected = detector(frame, 1)\n",
- "    faces = []\n",
- "\n",
- "    if len(detected) > 0:\n",
- "        for i, d in enumerate(detected):\n",
- "            x1, y1, x2, y2, w, h = d.left(), d.top(), d.right() + 1, d.bottom() + 1, d.width(), d.height()\n",
- "            xw1 = max(int(x1 - margin * w), 0)\n",
- "            yw1 = max(int(y1 - margin * h), 0)\n",
- "            xw2 = min(int(x2 + margin * w), img_w - 1)\n",
- "            yw2 = min(int(y2 + margin * h), img_h - 1)\n",
- "            face = frame[yw1:yw2 + 1, xw1:xw2 + 1, :]\n",
- "            file_name = \"./faces/\" + str(frame_count) + \"_\" + str(i) + \".jpg\"\n",
- "            cv2.imwrite(file_name, face)\n",
- "            cv2.rectangle(frame, (x1, y1), (x2, y2), (255, 0, 0), 2)\n",
- "\n",
- "    cv2.imshow(\"Face Detector\", frame)\n",
- "    if cv2.waitKey(1) == 13:  # 13 is the Enter key\n",
- "        break\n",
- "\n",
- "cap.release()\n",
- "cv2.destroyAllWindows()"
- ]
- }
- ],
- "metadata": {
- "kernelspec": {
- "display_name": "Python 3",
- "language": "python",
- "name": "python3"
- },
- "language_info": {
- "codemirror_mode": {
- "name": "ipython",
- "version": 3
- },
- "file_extension": ".py",
- "mimetype": "text/x-python",
- "name": "python",
- "nbconvert_exporter": "python",
- "pygments_lexer": "ipython3",
- "version": "3.6.6"
- }
- },
- "nbformat": 4,
- "nbformat_minor": 2
- }
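One caveat for the cell above: cv2.imwrite returns False rather than raising an error when the target folder is missing, so "./faces/" must exist before the loop runs. A minimal guard (an addition, not in the original cell):

    import os

    os.makedirs("./faces", exist_ok=True)  # create the output folder if it is not already there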
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:c0913e972eb5a6523ab5fb3ff200aabf26e2a6dcd08c4a7f6683fd868ef956bb
+ size 2867
19. Medical Imaging Segmentation using U-Net/U-Net (not compatible with TensorFlow 2.0, required to downgrade).ipynb
CHANGED
The diff for this file is too large to render. See raw diff.

21. TensforFlow Object Detection/object_detection_tutorial.ipynb
CHANGED
The diff for this file is too large to render. See raw diff.
23. DeepDream & Neural Style Transfers/.~24.1 DeepDream.ipynb
CHANGED
@@ -1,309 +1,3 @@
- {
- "cells": [
- {
- "cell_type": "markdown",
- "metadata": {},
- "source": [
- "## Implementing DeepDream in Keras\n",
- "\n",
- "We first load the InceptionV3 model, which tends to produce some of the best visuals. Feel free to try VGG16, VGG19, Xception and ResNet50.\n",
- "\n",
- "Code obtained and edited from F. Chollet (creator of Keras):\n",
- "https://github.com/fchollet/deep-learning-with-python-notebooks/blob/master/8.2-deep-dream.ipynb"
- ]
- },
- {
- "cell_type": "code",
- "execution_count": 1,
- "metadata": {},
- "outputs": [
- {
- "name": "stderr",
- "output_type": "stream",
- "text": [
- "Using TensorFlow backend.\n"
- ]
- }
- ],
- "source": [
- "from keras.applications import inception_v3, resnet50\n",
- "from keras import backend as K\n",
- "\n",
- "# This setting disables all training-specific operations\n",
- "K.set_learning_phase(0)\n",
- "\n",
- "# Load InceptionV3\n",
- "model = inception_v3.InceptionV3(weights = 'imagenet', include_top = False)\n"
- ]
- },
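As the markdown above suggests, other ImageNet models can be swapped in. A minimal sketch using VGG16 instead (an illustration, not in the original notebook; the 'mixedN' layer names used below are InceptionV3-specific, so the contribution dictionary would need VGG16 block names such as 'block4_conv1'):

    from keras.applications import vgg16

    model = vgg16.VGG16(weights='imagenet', include_top=False)
    # e.g. layer_contributions = {'block4_conv1': 1.0, 'block5_conv1': 1.5}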
- {
- "cell_type": "markdown",
- "metadata": {},
- "source": [
- "#### We create a dictionary of coefficients quantifying how much each layer's activation contributes to the loss we'll seek to maximize"
- ]
- },
- {
- "cell_type": "code",
- "execution_count": 2,
- "metadata": {},
- "outputs": [],
- "source": [
- "layer_contributions = {\n",
- "    'mixed2': 0.7,\n",
- "    'mixed3': 2.2,\n",
- "    'mixed4': 1.2,\n",
- "    'mixed5': .2,\n",
- "}"
- ]
- },
- {
- "cell_type": "markdown",
- "metadata": {},
- "source": [
- "#### Define the tensor that contains the loss to maximize (the weighted sum of the L2 norms of the activations of the layers defined above)"
- ]
- },
- {
- "cell_type": "code",
- "execution_count": 3,
- "metadata": {},
- "outputs": [
- {
- "name": "stdout",
- "output_type": "stream",
- "text": [
- "WARNING:tensorflow:Variable += will be deprecated. Use variable.assign_add if you want assignment to the variable value or 'x = x + y' if you want a new python Tensor object.\n"
- ]
- }
- ],
- "source": [
- "# Map layer names to layer instances\n",
- "layer_dict = dict([(layer.name, layer) for layer in model.layers])\n",
- "\n",
- "# The loss is built up by adding each layer's contribution\n",
- "loss = K.variable(0.)\n",
- "\n",
- "for layer_name in layer_contributions:\n",
- "    coeff = layer_contributions[layer_name]\n",
- "\n",
- "    # activation holds the layer's output tensor\n",
- "    activation = layer_dict[layer_name].output\n",
- "    scaling = K.prod(K.cast(K.shape(activation), 'float32'))\n",
- "\n",
- "    # Add the scaled L2 norm of the activation, with a 2-pixel border trimmed off\n",
- "    loss += coeff * K.sum(K.square(activation[:, 2: -2, 2: -2, :])) / scaling"
- ]
- },
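Written out, the quantity built up in the loop above is

    L = \sum_{\ell} c_\ell \, \frac{1}{N_\ell} \sum_{i,j,k} a_\ell[i,j,k]^2

where a_\ell is layer \ell's activation tensor with a 2-pixel border trimmed off (to avoid edge artifacts), c_\ell is its coefficient from layer_contributions, and N_\ell = prod(shape(a_\ell)) is the scaling factor.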
- {
- "cell_type": "markdown",
- "metadata": {},
- "source": [
- "### Creating the Gradient-Ascent Process"
- ]
- },
- {
- "cell_type": "code",
- "execution_count": 4,
- "metadata": {},
- "outputs": [],
- "source": [
- "# This tensor holds the generated image, i.e. the 'dream'\n",
- "dream = model.input\n",
- "\n",
- "# Gradient of the loss with respect to the dream image\n",
- "grads = K.gradients(loss, dream)[0]\n",
- "\n",
- "# Normalize the gradient\n",
- "grads /= K.maximum(K.mean(K.abs(grads)), 1e-7)\n",
- "\n",
- "# Create a Keras function to fetch the loss and gradients for a given input\n",
- "outputs = [loss, grads]\n",
- "fetch_loss_and_grads = K.function([dream], outputs)\n",
- "\n",
- "def eval_loss_and_grads(x):\n",
- "    \"\"\"Returns the loss and gradient values\"\"\"\n",
- "    outs = fetch_loss_and_grads([x])\n",
- "    loss_value = outs[0]\n",
- "    grad_values = outs[1]\n",
- "    return loss_value, grad_values\n",
- "\n",
- "def gradient_ascent(x, iterations, step, max_loss=None):\n",
- "    \"\"\"Runs gradient ascent for a specified number of iterations\"\"\"\n",
- "    for i in range(iterations):\n",
- "        loss_value, grad_values = eval_loss_and_grads(x)\n",
- "        if max_loss is not None and loss_value > max_loss:\n",
- "            break\n",
- "        print('...Loss value at', i, ':', loss_value)\n",
- "        x += step * grad_values\n",
- "    return x"
- ]
- },
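With the normalization above, each gradient_ascent iteration performs the update

    x \leftarrow x + \eta \cdot \frac{\nabla_x L}{\max(\mathrm{mean}\,|\nabla_x L|,\ 10^{-7})}

where \eta is the step argument; dividing by the mean absolute gradient keeps the update magnitude roughly constant across scales.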
- {
- "cell_type": "markdown",
- "metadata": {},
- "source": [
- "## Implementing the Deep Dream Algorithm"
- ]
- },
- {
- "cell_type": "code",
- "execution_count": 7,
- "metadata": {},
- "outputs": [
- {
- "name": "stdout",
- "output_type": "stream",
- "text": [
- "Processing image shape (353, 529)\n",
- "...Loss value at 0 : 1.0304297\n",
- "...Loss value at 1 : 1.2510272\n",
- "...Loss value at 2 : 1.6979636\n",
- "...Loss value at 3 : 2.271832\n",
- "...Loss value at 4 : 2.8770106\n",
- "...Loss value at 5 : 3.4737113\n",
- "...Loss value at 6 : 4.0769672\n",
- "...Loss value at 7 : 4.617928\n",
- "...Loss value at 8 : 5.15317\n",
- "...Loss value at 9 : 5.685014\n",
- "...Loss value at 10 : 6.1906824\n",
- "...Loss value at 11 : 6.645993\n",
- "...Loss value at 12 : 7.0928683\n",
- "...Loss value at 13 : 7.5481906\n",
- "...Loss value at 14 : 7.9534693\n",
- "...Loss value at 15 : 8.386046\n",
- "...Loss value at 16 : 8.765054\n",
- "...Loss value at 17 : 9.139767\n",
- "...Loss value at 18 : 9.5182705\n",
- "...Loss value at 19 : 9.86855\n",
- "Processing image shape (494, 740)\n",
- "...Loss value at 0 : 2.864695\n",
- "...Loss value at 1 : 4.0144286\n",
- "...Loss value at 2 : 4.9185443\n",
- "...Loss value at 3 : 5.688157\n",
- "...Loss value at 4 : 6.3830137\n",
- "...Loss value at 5 : 7.0187526\n",
- "...Loss value at 6 : 7.596686\n",
- "...Loss value at 7 : 8.150836\n",
- "...Loss value at 8 : 8.67249\n",
- "...Loss value at 9 : 9.186928\n",
- "...Loss value at 10 : 9.649975\n"
- ]
- },
- {
- "name": "stderr",
- "output_type": "stream",
- "text": [
- "/home/deeplearningcv/anaconda3/envs/cv/lib/python3.6/site-packages/scipy/ndimage/interpolation.py:583: UserWarning: From scipy 0.13.0, the output shape of zoom() is calculated with round() instead of int() - for these inputs the size of the returned array has changed.\n",
- "  \"the returned array has changed.\", UserWarning)\n"
- ]
- },
- {
- "name": "stdout",
- "output_type": "stream",
- "text": [
- "Processing image shape (692, 1037)\n",
- "...Loss value at 0 : 2.9274173\n",
- "...Loss value at 1 : 4.0277452\n",
- "...Loss value at 2 : 4.9098253\n",
- "...Loss value at 3 : 5.6878753\n",
- "...Loss value at 4 : 6.426328\n",
- "...Loss value at 5 : 7.1218004\n",
- "...Loss value at 6 : 7.773466\n",
- "...Loss value at 7 : 8.408107\n",
- "...Loss value at 8 : 9.042133\n",
- "...Loss value at 9 : 9.666492\n",
- "DeepDreaming Complete\n"
- ]
- }
- ],
- "source": [
- "import numpy as np\n",
- "import scipy\n",
- "from keras.preprocessing import image\n",
- "import imageio\n",
- "\n",
- "def resize_img(img, size):\n",
- "    img = np.copy(img)\n",
- "    factors = (1,\n",
- "               float(size[0]) / img.shape[1],\n",
- "               float(size[1]) / img.shape[2], 1)\n",
- "    return scipy.ndimage.zoom(img, factors, order=1)\n",
- "\n",
- "def save_img(img, fname):\n",
- "    pil_img = deprocess_image(np.copy(img))\n",
- "    imageio.imwrite(fname, pil_img)\n",
- "\n",
- "def preprocess_image(image_path):\n",
- "    img = image.load_img(image_path)\n",
- "    img = image.img_to_array(img)\n",
- "    img = np.expand_dims(img, axis=0)\n",
- "    img = inception_v3.preprocess_input(img)\n",
- "    return img\n",
- "\n",
- "def deprocess_image(x):\n",
- "    if K.image_data_format() == 'channels_first':\n",
- "        x = x.reshape((3, x.shape[2], x.shape[3]))\n",
- "        x = x.transpose((1, 2, 0))\n",
- "    else:\n",
- "        x = x.reshape((x.shape[1], x.shape[2], 3))\n",
- "    x /= 2.\n",
- "    x += 0.5\n",
- "    x *= 255.\n",
- "    x = np.clip(x, 0, 255).astype('uint8')\n",
- "    return x\n",
- "\n",
- "step = 0.01          # step size for gradient ascent\n",
- "num_octave = 3       # number of octaves to run\n",
- "octave_scale = 1.4   # each successive octave is 1.4x larger than the previous one\n",
- "iterations = 20      # number of gradient-ascent steps per octave\n",
- "max_loss = 10.0      # early-stopping threshold: break out of gradient ascent once loss > max_loss\n",
- "\n",
- "base_image_path = './images/aurora_norway.jpg'\n",
- "\n",
- "# Load our image\n",
- "img = preprocess_image(base_image_path)\n",
- "\n",
- "# Initialize a list of tuples for our different image sizes/scales\n",
- "original_shape = img.shape[1:3]\n",
- "successive_shapes = [original_shape]\n",
- "for i in range(1, num_octave):\n",
- "    shape = tuple([int(dim / (octave_scale ** i)) for dim in original_shape])\n",
- "    successive_shapes.append(shape)\n",
- "\n",
- "# Reverse the list of shapes so they are in increasing order\n",
- "successive_shapes = successive_shapes[::-1]\n",
- "\n",
- "# Resize the NumPy array of the image to our smallest scale\n",
- "original_img = np.copy(img)\n",
- "shrunk_original_img = resize_img(img, successive_shapes[0])\n",
- "\n",
- "for shape in successive_shapes:\n",
- "    print('Processing image shape', shape)\n",
- "    img = resize_img(img, shape)\n",
- "    img = gradient_ascent(img,\n",
- "                          iterations=iterations,\n",
- "                          step=step,\n",
- "                          max_loss=max_loss)\n",
- "\n",
- "    upscaled_shrunk_original_img = resize_img(shrunk_original_img, shape)\n",
- "    same_size_original = resize_img(original_img, shape)\n",
- "    lost_detail = same_size_original - upscaled_shrunk_original_img\n",
- "\n",
- "    # Re-inject the detail that was lost when upscaling from the previous octave\n",
- "    img += lost_detail\n",
- "    shrunk_original_img = resize_img(original_img, shape)\n",
- "    save_img(img, fname='./deepdream_outputs/dream_at_scale_' + str(shape) + '.png')\n",
- "\n",
- "save_img(img, fname='./deepdream_outputs/final_dream.png')\n",
- "print(\"DeepDreaming Complete\")"
- ]
- },
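The "Processing image shape" lines in the output above follow directly from the octave parameters; a quick sanity check (original_shape here is taken from the loaded aurora_norway.jpg):

    original_shape = (692, 1037)
    successive_shapes = [tuple(int(d / 1.4 ** i) for d in original_shape) for i in range(3)][::-1]
    print(successive_shapes)  # [(353, 529), (494, 740), (692, 1037)] -- matches the log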
- {
- "cell_type": "code",
- "execution_count": 8,
- "metadata": {},
- "outputs": [
- {
- "data": {
- "image/png": "<base64-encoded PNG omitted: inline matplotlib rendering of the final dream image>",
emCRcej0SSRUFz6jRbNNyqu4y6IDFQSwkTPfdyQboBLoD6prxhK80DS/DyJ480JK7ds3UGmn8gKriAhDA/E5dv9B/Oe6/DpVSSRmD4n1DGMfgNf3mPSueaE2xzrCCIJGkxAiIIP1CEsIAaGMZj7QwQev/7Z34+I/+Tf5HP/3JF4RHQR+e+A/30uZf6Xf50Dv32LkMAwwieXm1ui54GMAtY4slPReSN9gHeQBVVHBwzxb9nfGDOFiRDG502YnKGCqiE+Jg/ZA4/A3REzQgN00C6NlpTFbPLEN55aU0JVcYLRhCKGdyetCgSX5zPL2UinjPhADgkZcyHvfSI/s8LpHnpvpFIInxRNQXFnXs/uLKeFbczobxJITgRCf98oXxSidcKm03eHlIUhjhYjeiBq2Oqwz+wgZyUkI+ZIgtBARmK7VJYiZJGJQpJgAj6c7bwzxrjxe4kjB9yfMMsIOxNj3c+6hQYYqBpExTFSEWJklMTMgUDHoNbAFiGdBO+Z3gxJgnVnVKEyHUwNh7ZSeyCL08fAl8oiipPwrkQC34X+1EiW0HvD+4bN3YxGQrzDUJRMi05xxdMgiSIMlkOm1kHqRqeDKKqOyESlw4URhprh0RgReDfKIiSbaFclkWsgLUj3KyMGwweKoTmhdM7nCzFkOmBV+hVseWbxzGUvlBz0cSUfCtTAa2A4JEgPBY+4uenJq+6P20w0zLAeaBE8DHojuiOSWUrCaXitCIIuC6eSue4fcQ1SKrA5KSdYlXFtlNXwfeBbxw6Cm2JZsWSo9Bsjbuxj8ssWAE6LQd8rrsGaEzXGjatOM1PDUVeChrpgEuwaaExPKT1wAtdMSQLDcJ+B3RnIcFK6cfuiIEqE0HxHXG9IuUAfhDaSBjEg1CaFOBRXwfJnhlrxCETBnQn4PBijYSIM7/gwbCjVhXVRAkUl8NEZlG/RcwXCDXMnZZugMCZgdAExUAlw8O43pmVm7zK5Bya/A7jyuTzy69hvhBOPiL8L/N1f93i5DSRiTOSL0MSxrkDHAzoJd0PoqCjCSsYREi5Cj05KQnaBmMUiNyGZoNGhwzEtXL1hqoQV6hiYgiW7IfeMELQ00XeTMXnCvSMpMRAkzYnQoXStpNcr0cF7kNfp5Ft3TseFXidf1m9pE+7U0enZEE+kBfo2iJwxdyQ73gVM2OtOdpmOWpyic8HGIRh7J+ogLY6+Bn0cDJ/wz6+NYsJ4HngqqM4rGwbaAklKSkbvHZdOix0bC90rJSXG1ulLIm2ZfF+gQYwdfC61MhJmZxiDGAvKxucVJyWhOUiaJifsjhLAwPHpzB3MBJEOUhhaITVKF8wXdmaa3hYjtNMzpNhJuiL7Togi1qnrRmngVggKjAFtw2MgkQidyNrWRHtuiEPKCZWCWKdV8G2nmFFHRQksFawLe+u4GlgmRgecJQd16wTOclDiFgQ0gr4pQx3KzAA8Lvg1o9IZJlhPyMnRDn1rHA5GJKM6WDdCjYfTyvPlTI8V2o6Z4TnwNjndJkHbB3lZ6NXx2lBLSFTyt8HTQJ6IOKL5jr2/J9ojllaUmbGqOtfzM3J/RMaYaLMEbe+kLZFLpg9H+oBVUZ8CARVn643FMqsnthvnOzxQd3YGKpMO6dugiRN93PbyRNtJhBSBewUxGuCtzzqWKHKbA+mDqIMajuUFFWZWgIIqFmAMhldclOwGEXSCrW6QMykZ+16xuHHOtwBoLmQ1+hiM0YhS0DBMddIsCYbNmkjtzpIKngdJE3t1TBxXIAmxVVSFIUZCJppWhegz2MrkvjWUMW51KAG7FdNN/FstgxMM/Vw+9X8r3eC/t8Lmr5pM3jp8FoZEbuhvqizEFVQwmemHqOPDZ0Q2hd5JpdB7xRQkAZ5ICm00mGUKtjpAwJOAOnpD72bK6I77wMfk1tGJShmCHZYZ6ceAIvRzQy1jh0R7bOhwyrHg5owQ8MHlqWH3eS4cATVjeJCy4GPm6eETlS0mhDoegiZl9IGH45bQ4ZODj0kO+dYmUjxkugfpY8eTggupGFpAyuQ3tRgZo/rA60CzzaJdDTSgm6MxEUAMofdZ0K29E1mIpyDujFJnQciH48eFY8mc3efmMGV1AZvcYSYR0oGMaboRWnOONSaizyVwbPKjPohseACqM4zaoI9OiqCkIJVgj8Z9fuDT+YmhQfaCMWg+qObYEkReYN+oVbEhaDZiV/JdIXpnvwwUp22NtCqWjsSls5Yju+x4V7QnkgUundgCM7sVuDoPD/dctjOdwLpAmuirj066SyRXWlOs37K08ZZXr+95en5POi/4MdCjEBJ4KBRn9IzLYLRKOgRcoQVozjTv9K43FCeYgtdBDqebzfp0JNCg7h3CWZcDFw/cn0jAcnrN+VrJ2QjpjG1nj+DgA+mBDiE0QRbcfFISe8OTkpaCb7MAPPYgkkKZ+6iieAyg4LHP+scwxBtaDK8DF7sBzMCK0lsgY9YE4mD0NlCmmEHE5vkU1IQWdiv23/aogFgQdWbLbhC5zDQTx8otiCVnDKAGavlWTwKLgZoyHDYCxLBDmYFMgdHIS6bXyjAhrSupCZYKYztDOKODHGzWdfRWDU+Q1SfYUWGI4jqmQKlN5C8SiBmjjVmPURjhiAgec26nY4pv62//UvXZv8J+K5x4wCT2Vcgyi4kielN4KSCI+0wtdaZNmDFu1eZIDq1SljJRYXd6begCLoY52FqQ2mjhiCwQFUMopogEXaENJZcFzUBvtG2QkxHdb1RL4BeHJaPLjZtflKxGBJRQhgpxV2CHfu0QyvqgtEvj8HCg7xUrBV2Fts/Uq+2dlACbBashCiOQbEiG6FNFkU6JJWfqCDwGpoKsmWiOqWAWcFFqG9id4b1COoEFORL1UpEMdiwc1jtkv2IyC4Yhg1nOCOhKu1Ty/Upsg0tAaxf07sDonZQSx8NAzBlDcHdUBLnJAEmFbLPibj5TVQa4TLXArBoEowO6UMQZNhU2A8X7LYA/JNgGrQosheob+WEhrkq4UPcxA0YeExlewdaFosbIhn2c1M5A8ewsD0p9dtQKsQe0TgxIPehJwBIxdmQPsiljMXpt+NnhGGyPHylrQmVhVJDopMPcgFGdDUfKghoM3bmMil/H5GOX60R8WfEqUAfCAMnEtZHvplPwLRCm2uH0sPC8V8Rn8RCB6D759tZZy8LzduOIU0aS0i87SzZam1TcvjWcwCTRhrMeF2zs0CbKH8z6RVZDpXL5tGGLoSnhVUhJp+LkrrA2Y2Ul7oP26Ur4gL0RXQgWfFzp135T6QkaNjn60SdlSaYvCZGMFcX6jhFETPDgInQCiVuhtXaIRDHoGI6DKXajM4ggqSG9s7lS6GSB5NCZmVIdgYbgHhCNvkzHr5IBQ9OAUXFTxujEYiwh1MuGG/i1kVOZfEgaSAxoikQjRKHdlC9pMuehs761nzvr6cBokLVNH2fMOpsxuZabAxdlKq241ca+lYn+evZb8+yU8JsW9+YI0k3PKTL1uhqfi4uzHD4iUBWsXyiaSEXxfcdrwxbl+PrIUhJ5+ER6faN6x0qmt31SuKHUIVy3YOzO
asaSZxFtKULSQFOmRZAOGQGGxtx8DCyCeq7sTaZixjNjBznPFFyXTPhAyeRS2C/bTFevO3gQBrV2xujs18qosF9mtqACJRz6rGK1c4cYXD5sSAinuwR1UM9TGmmHWaGP5GCD0A6L0vsg74rvQjkm0kNmLcJ1bHit1OtGxIbdCbH4VDSE4UPYnoP97EQbaHW2Dxf2Xnl+f+GyXXl+3NnaxuV84bptXM+Dfe+MveKjUdtOxwmdUkctE2EYTkRCYyVjZAqZBdXAMpSTUpaCRCJLIqUVNSX3jDU43mXyXSZnJd8nygrHU2Z9lZECRkLPg2SKpESsiqYD2ELSjHUjrwtenLTITNGzEaNNhLUa0YyxDXQRdC1YZJBlot86aHsnygJ7sOwBzbDTgS/vHsgmyEicFiXRyT2TPge4OvDm9JjZ1f3pAHmhPRr9qSMCOSeiX7j+/IKKcDiumAtRleiVoJOyTR62OhKCFZuyvSpT4YDTxmBvHcPwCLIZl/3MAJIdkO7E4tyfCn3f2a9X8kOGpCRVvHZq7aSHgg7HjoaPSt2AZJNm9D6LjfsFy0rklZGMiIZqx5JO99srKQWjPXPIC37tqAl77diyUD9ciD2mQMEUHLKmud11UnBkyNanJDTJVJO0IMqKqTFMGBJQEpoSo8uU9GJYWfBc8D7QlMA69E7fB6pTYuzWpqwxJhW0uoNURgyqbsR2RcMpx4QkIakia6KXwAViCKkOkgvrsdAuDTPHEVprU3mmMtVxn/1ZAB4Trd/IR0Jvmv5fz34rkPjUSQ/EbUr/4nODw3TgIjILWOZEM8wS0TuEoLqCOGPvrGtmq45unWaN4cKik2/qXTEF7Q4ovXYOd7PwuFUnWUJU6G0jRuWyr4yAaI3TCrV1ojnLQ2G7OnkLyptC6xnpTt+nxrhnWFVJizBGR+6M/ekCKbEcEvvWkaz0vRJLhq2x3K0MJqIVhaUUSlnZtg3fg7woshbqpfHw5Wu21jj/aUNPBRmGkqifNvKbArEja5CT0q8LVAUDO0EbwkNOPD/vXLbgtK6Uu4AhxMWxRfDhLAvszRELeoP+1OnWOKyF+OhclrmAx1OfmtjTYPSgvwoiCt4X2sFJJZOHT729GUZhdGHUC2U1RlaIib4xpaQN58JgnYUSTaSheOpoMWoIculT/3w18vGAtkoPQVNg0hGMdgjKoRIjqGchzlPXL2NnWOa4NN49fsTSkZEndZRCia0xFNKAOCSWMvXp2jrpOGVi7TrQpfJX/sqP+Nn5K3pjUkm5My7Bo3+k9jqLgCSaXkn5iXJ4Q98mr65WuP9eYf+wcb1epnMo90QU7NjxDxfSsQA7175zHZ0sxjYaqhk9KPujkFNDl8HASb1QDkE7Hmlx5nBcWV04X8G9onRaE5ZSkOFcLxuHhxPkzrY9YyvUc8IoaOlsH3b0mCgpkArlmOkNYtvZEmRXIiW0Vfa90WtweJVI5539CodDwbcK23Say7ogJbH64HL9QOzOcl+InCCU+7f3RG1cGxAdSzCkcFAnWmPXhtaELAkd4K4sBN0rbRgZsNbg4QHfK9LBzfB9m6KAqpCD1I2iMqlTc2Jd6J5Ru4O+Ufo2axWmbBKUhxP65Ggd9ENmHxveg1EbKp3kwpIyI8BxhiR6dRKD5ZTYrjtLTpgq6kLIdOA4iHyWE954cGy2UygE9mv7z98KJz7duBEyNdJxc94ut65Dv0Ww4RAdJGM5T81ma3ibxee9wtu7wrVXtnNnLYKlzH7dMU2k48K+OUMG5AK7ED1YD4XwoLeKpcThboFQ8kOhXTeeK5N2Kcb1vJGXQu+Ov79wfP2a509nkhZyfoXXTl8/Qg2Wt3fsW4M1IbXj+9Si9hbkNGkecmKEQR6YKCbGGMG+X1gPB4Y1okC7dGwo22Ol1x29WzDt6DKw4bRTYWzO8qYwTk4SQ57yXFiayYeC9c7WEzKcN6+PPL17ppSE3guXx0baHDVli31yeNGxOyN2sLEw9oDREdt5fBbKcqS/f0Yq/M5f/iHn52fq1nnNrcEiYMkFXQ2nz8asIago1YMlpUmjbReGO3osgOCeSHJFSif8CHlj1EBfFXox9JKwh0FLA//k5IMwNue6JExX2nZlD2fszuG0ouG0y9S7R1JcrxzevGXsitkZW5369VRRJG00EWYtTCevewzGPiA56xpgxs+/+YYuihEsS6G1SimO4+xbcEgOx8EaC6NmRAa2Fu7vE5zA9oppZQwjnQTajvRM+2bDHox6aYQ6SRdYjfZ0JSWlD8GvcDhNJQQONoLRB60Hm12xAXtdqZenWfuxRBtBeTiyP12wspDvlShTMtu2gmrn+CBs153eO2ktqBp2XKlPne6zz8GWBdsrlCNUZQQQmegV64J7Y8lGH4JaBlMuMdd2CYhyQL3jIqScCDa0OaPvVDeWU6FuQkgie6eHsByONAVGkEflug/s/sBoTngnWcGb0+rOet1BEpIU06CPwUWVLE5yxbPQ/YyURBmvCfmEjzrHq8qIztUzx8PCm3vj8nHAvuE5sebEiAxtYHZAitP2Dhch3XWoR1K+ohYzm62DxBQ6pGQMmZ2twk2pcmMZhs8ehcGNXnFmofTXtN8KOuVzq/ts8pvdbBLzn6pKsjEbAASOy2HegD7otWMmaBLuXq0s2tkkUBKHtw/0/JqLGlIWvCR8a4w2MFNMhVYrZoM6BvV6pktlazscVkKV7eMTqSfu4x7bCicSKokIZztX0qvD5ExHp9TBOP+X/NM//vtoFpo0ZNlxoO9OF2XcOEAZ8ykD3Tu1d8Z2ZTRou9LboD/uXD9uvP/4hDPYHndOi/L2L92jayL/7pHjHSxLIZkQ141jVsrxjl4zaTf2CySdxeJ1BXnu+KcgPW00NT6cz6Q3wvlj5d1Pg/t8IqWFrMrxfkWjE31n1AvWzuz9Qt87betcHzujN8b1yvXxwuPzM//sT/6Ea3eu18r+VNmunXrdaD7YnirwhFDJpU0pYoJgsJA5rcJxlRtSXwgO+PMR2RfKfZCLzqDXE3KZgXkpSo6g3K1YOpCOhaJGiKN3Tjkpy6syVSpqpNdldgGviruh4iz3IEmJvZBYyWVhHANbMiIF9g7boKTjLGA+zUImovTayHswtqB9bPhWeH4Url8PHpYFXQtZDvShDAyvRnrqjBqMj1dcEj33KacdBRmVJI1yWiFkKkSWleXtCa7KkhVfpnqkfrjw/NUzu0PokVKOk07Yg1VeMc5HbDxBypT8CnGh1UHdGs1nUTYvAx0yMw3dSIuiJNbXK8d1hVtTS3/X0cOUsLoGI+CwFEpUigU9Z3w/g0HzRDCbfhZkUo8lOL4xUMe1s13OwOB0Z3hz7stK90EVQ4qyPb6f3ccOOSrXbtCcNTvdr2wRxGL03man6N1xFiVl0qZ6WmaRXHwKCQ5HshZGJHLKmDp04+kCX70P2sdg3yAdFWrl9OUrDndHvO30p6fZjJaFPRvdOtu+c1nXm5xRiAxkJUZF2HEZxL0QOiWEdliAGUjFlaUk5HPWLUG4zD6TEPT2+ABnatd/XfutcOLCVKNYyK2hgm/5I4D
effKpXdi2xmiOG6Q0LzqnxPPTlX1znj40Hh8HT+8e2T98xLf5uXWFNIt9w29qjONsPmjPg5EeeHj9Pd588Qq9dtp1Z3/e8Ta5b+uN63XqycvdiXSfoe5kvZI8U2Ph7R/8b/yN/+xv0p/PpGZcHqccDRrRK8hCq46EkxQIQ1owHO5eZ+7eZEBnQ8wItA9GDV7//isuT53zV1eSNN797B0uwlPd6bpyjcT13c51f0bFGPssVF20496InNkvnajCc985/kHhd//agfWN8PDXTvzeHxz4xc8uPH59YVyEcQV5k8Ez+bjgupCXZS7M68APC3Vv9Etj+9TZro39euHy6Yl+rTx9PPN4HdR957xtDD8TXBhPZ/a605MTZJSF86cnru/egQ5OZNiMNIJimeP9PSINK4J5JnUhnxKHLzMZJ7uRs7GWRNKVZNPZ2bKw3q3kLMiI6RgryJ1h6+Dw5o78WqiHkUk5AAAgAElEQVTPZ3RX4jHoErifWU8ZrUp93MiHjK6FOGTufmfl9KBwEPYmQKGLsiwZWSFL416V45uBLYHWznU749U5LZlWlJED2Qb9XHj6pjL6wv7otF2QWCYw44psg/VY0EXwfVBOg7DCYS0cXmfWH70ivb2nd6VVn+vXp568bxfSwfkf/6f/mdGd43JhOCQMauPV3T26CmLCdd8xLQR51jJU2b658PzNhlpB2kDfCOshIzpYsnHICUZGhtAuncOos+Uc4fLNV+Cd5opLkEXBjfGhkbYz53fPlPs7jnagPzf62Hj/7hOWE4clszicyh3WN/LeSXeF9QDbdkafz6wVTBYkJbQ5se20ayO6cdmuLGEkT6g4OYTjklE1DofJNu/vH2mXwGwhvb5DHxIeieVuoT03tpR4fNr58u2JdcmcawAbYTtlXNm3Tvrein+64jronklpZSlQm+HW6W5oEyTPQm3bG2oKMh83UPc+m5BkqmKGT0VQ8FmdM59ZFP8W8pTfCicOE4XH7HaZz8roAxWh94bYwj6CbPnGNRrJFRGhD+P8/szhkHn7ZSGlxiFt3JEoEZRSOF+UwylxWBJ5DPLRWFNiyStXuVIeVuS60ZtgW1AvO9TO3e/9kOeTsn9/5e7Hv49zoT5Vjtko+Y7Hx0ovT6TjxmX/mt5+QTr/hGU50ZfKfoV8zPTq5NpJAoPE/cMXhDzwsBz4yz/8McfvK/3xPVw2UjLuXhXchf0XjfL2yPZPnzmosa7B06Xyw+89cPl4JSk8/uJMicHlurFegr4rd+mOfRukNWE/eEX9+ROeAv3hyvGNcXz3jk9//IFlf4Rvnhi/+MiXP3rF2+/fcXxzQnMiP4N7wveMSyIVw7Mhrw/sl0argT9tqDh93+ht8O6f/AIvg48fHvn08/c8fty4Xq48XR55/qhUXegtQ5upcrCzvjLuHpRGJeP0gJSd9UFZOJIp6DiQLHN8CPLrldQT8vpEsUTqji1GPhintwuH40opBb+C5kK5y5xCKWtChlDuVnpVsjvH1+vU2q9OqLC8OmCsnF6vLPdTSaI2n0/SP3S2p8ylLWjN0B0ujWvvbALxkNDF+I//o7/B49MTj2PnoXTWV3D1Zx7WzjU6/fXUCosrcc2o3JPLCSVTr0HNicOPM+sXK4WgtI39GbpCImjvNnxsvPoic58Sepc5tyAdg7h+pAFC5b//b/8Wr9eBf6iECHf3B0SVkWabeDs7Swpq7xxOCz6cXje2y47dZ0SNkRM9GiNXKJleE/tF2c8b5+eK68BjpZcDdTjyvQfiCkbQw1lyQiXz3Kag4PT6ngOwf3ikSmeYkVKmnxuxFvyQeVxWfDnRAy7VCG/o6zd8bMJ2LLCA0RnA6c0B6mC7nslr5vT6LVuvjHNltI0CnBblcCikpWMPd8j9fI6R2ZX7/QNp7TyNC/X+NV+8fUBb4/H9mfHhyun4wNUX3nThMIzU4HSBH55O3OWgnzt5gLeBLqdJTveA8+BOOikLoonRBqPDqI7KFGRMJdbn7nGfjwlwpswQ/q2Q+J972/2/i2kqUe6/nO2yYnQfGIFauvFFmRaVsgueOpEKS8qICzU6cr5wuj/xeL0yHF7llXK38tf/wx/xJ//gH/Cz9wFvXtPXhm6NgZDD0Df3PJcz/PRKKneMYbx6vfL8zQfG6Y6o8PC9jd/pX/HH6QfYB6EuxsEKD188cDl/wOs32P0Jba+R6x1b2ugffsHpD75HCeHp3ZWclLgK93d3XBbI/co5Bms+cHhzJPQD16cLUg3snp5X8lY5Pw1e//jE3VqJ0z01KvHcSUOpsnC9NB6OhXpovM4Hyqlw3Z30e685HK5whlwUf3K8Z8raaIvxdG38zo+/IJ6eaKsj90L+0wJ3B/zrgf/+zlKN7b0CO7UJ9pCRYSQa7SzIJZHedg5fBnr4EtmCL7480i7C2x8+UA7Km7tXnN4sHA38dOB0eqDogWJp9m+eBFLmRKYA/dMFPxVIgrbCF/kv8afbP6I+AqmS3yoNZX//RPVG/KJS1fGc2ceYqiKUj89nvA+2540ALr94xhfFd9jOZ7Iqew+uT5D1mbYHQzf604WxJx4/nDk8ZLbHjh4XdHf61jhfne3T4OEL43reObwpEJnlFfhXz+zMtDj9eEUq1PMT6e4V5byhP/JZVLQ7aoX2k5Uv3rzhw/Zz1tfQnzba0ok9c1g7n35yYXwS7IvM5ZOjZ+eL3z/xk5++J+2Deg2WH57g05nlTrg+Dw45OI8r9Suhx6fZoar3LCJsEXzv+4rqiedPj1iZ3Gu2yno88MkL4+tP2OtXjPcXhgnrEZbvC1YW0tMrHv/4p4y76ZQ6+wzue/D8uCPSeCWDp70j9wl/Fk53D3z1+A3raLz9/okPl0Cfz7z9D16xfwzeff1M7ME1V7It1McLCztf/PB3Obw+8vXHK3Y+s4tz98U9sRmXrx5Zf/wGV8jRMQ+ePz6jpxPZDelnRg/SVci5IYc7eFX4eNkY0QgP7l+/JTM4/+wbRL9gvIJcd+5K5torz9fB/esF6eX2jBth9DrrGOcz+S//FcLe0T8K9fKEnIxcfodP777idHeEKPR0wWumZKFfz+jxQB4+u099OmiP+XA+jykRni36kycPBh+++umv1Xb/W4PEZy1TcXw68JsUzX22HqsHXhRbVywUJehtZ2xOs5Wrz8LO4c1btMBXP/tT/o9/8if8JN+x3s8nDt5JsLdE40DNJ+JceUUiaeJghny68vyzJ5bUOX6xcjgEj3/0nk/vXrH/UYeyzHbctND2ji93tJrYvtnZ9sq5v+fpm5+w/+B7PNy95v1PnynliPaF6CeWurPUT7z90ZGlC/7hzPOnnXoWGCv63Hkw5+P/9XPWY0YfhLc/uOPdU/D8sw+Mj4N1wMbAZOOL332N/dW3vOEVrQRfv6toZPLjoERm/N5r/E82fFP0mFh//Bq2xA+Whef/1/GvjtSfF45hHH9w5PqTnVGEY75Hf2DUp4Gdjqz3B/o7h2udaqAI4vBEeaWIZ/reOL+7cH26cjwE23bh8v7Cz79+z/9H3ZvsXJZmaVrP1+x+79P+ndlvnbu5e3
p4ZERCpqKqEgEqiUGJK+Ay4BK8IhOVmDHmIhjCACFAkFVImaQiMyMi3T3COzNzs7877e731zE4lsCMYIAUsSdneqRztPb3rbXe52m6gYPTYDX9MNG1DfVxh45HEu0pkIxjj/ETQStSncEkUFHEg3lFJDRRYk5rXWSo4CizhPk8gVmMzmK0lmgvyZKU0TmyKEXFMZGOydOYeFEQFQlRJsirDIRAKUNxfkrJYS2udhAVqDJmdrXA6gQVFJEKTG4gWVUIa8Ebxm4gy1JSGZNnkvqbA63XyNZTLGPkUWH3nlTPyJ2lbxWPxEcIq5BKEieaInV07gZT7+jeviNMI2NjiYbA4Q3YNuLs6QpsxDyT6AK++eIdhZDMLyOyAoppQk4SoxXrMmH7xnK4UVTnGrJLhJphoowWQxEpZtmSYBzCCZxxjHULHdjWkXtJkBm+npA6QuYRxDFeOkysEdoidUS6jNGRQosc7044CBdGDIKjzkjmJf0gUbOcYFoSGeiTGZujQ4SYOIl59cuBV18/MAw90Zng8eMFysH1Hz9n/vFz3r675et//C12anFJSXqxJvQTfrcju1CIaEL4AdtNTFaSX6wgOKxwxCpiqSpimeCKGS6Cze2BIktJREaZZBzePtBs90xE5IlkEU9EQ0ctBCZYghjpDi3dwwYaRyEUsfSM7YhcnRGbB6wXTDImVBe0bUxz3DNfr3DdhBwbCmeJ7YQfLUGehqGjPdEM/Xs2T/weAS3fp7GFPOVRXDitLP6uz+9NEf8nNLLw/3SNOBH/hDrFVoP1RFK+3+w4oSiDCKSFJJGe0XnQEUopJlVQrp8g3xiEhK1KSXNFk6SILAM5YrU5cQtqTZLN6CdHVOWk5yM2jpgGQ7iVxB9+yHH9EmYlU+LwE2jhUCfWHWRLoqSEIFBqYn6+IlYJ9WEizST62FEfBKOU9K1i/u8t2bTfcEwMky/JxhYjPSIExsZRxyt+8p/+hEE5BJZ6rDHpRMgE5mAwkWO9fI5+cYkpOlahRj+qmf9pTvlEv0/LBRrvSR56wiw6nT5TgbeeiI44ecrFU82ff/Zfo22J+WpASo3KM7LzAm8kpZfoD2LEQuGkJb/IETH0/QS9JZnPUEXKYDyiH0hKxWA8717vaN9ucKnBpp6uGRjGju2+pd4dGeoj+tIjoiPsHmh398hkxOwMUZFjsdj3HGvsSCQcMhUk0mKaHd2XG6agMZMnmSfEUUEwkrxKEOLUt1WZpEhiskV+KkixJpcZzkhUrJBFjA4as51QkyBfzymvH2GDIjICZyyJtFhhsQ5srzAHQ6wT4lSiVELoJ5p6oP1uh3WBeBpZXK3QUcU8iojHARf8CVVQLnj9xbdIF6HoiSOJigLjQ01cRqQ/qoivI66uKubzDK0UVqQc6h5vLJP3+A6Y4Obe4Q4e4x3vbjuchO0XtyTPYqZjz8V1irETs/MCOxrW1YRuO2ymGCdD6gJBTgjvmRUz1OUFPispI020yAi1w7mRMj3NodwokJPDzxXZx8+RoyTyCfW2wx1aDg8tUsWYYBjtQD068iTFBoePPZO1xDjWeqB4s8EphTRHHr884+mPf0K2fErpE84frUhNTJgE689e8vSzj/DjyLS/gTGgnaNaF6jFCt04qihnUa2YFRnKeWRWkaIodYW1O8p5Qm8nDu2Aay2ymygLRb8fyFZz3DxF5zlJ0nPcG4bFnP5QY4Uk1gmZjlisMorQ4k1LmALZMseWUEea3QbmM42eaopYkrqecewgVhDH1LUGFRDTiSfPKaOFIqCEOhEo/5/JdDj1yv/JP/CHtp0CnPgB/vTlw/vGviOcPq1DJ9Fp9UlapmminzwiVTTDhCPg2gAmwaNIlaO5+5a8aHFjD+PAGEfEQeOJoRHI7ki/99hOYMYOlUjKBTgviZTHPjwglgmlipiyW7JnK6RRGCmZJoeKYvTYk2ent6gKA5Di4oLZOGH2BxIt6bKC8smSLJrYRhOHB400sHANZnxgiAqSFHSS49OC5qsvefPLL6EdefrHS8bNRBpilFEs4pj+PuLulcLcF5S1YLd9y+g3CJsQHXaoeGRUCenkufnrd8z+aIFPYoKyxEtFajPG8TVtB//7r/4LijM4m605bk/oTztOuAeDUxX67Uj/6w6V5wx3ligpkCohfbRkPAi6e49rFW6C0E70N3viUuNSj/EDdjBMjaW/Gxg2B4Zdw1B4Dvctb37j6J3E6pQkSKJ1Sr3t0CSkdmRoe8RkCLEgTxwTIIUkWlwgkpSyWBGH07ZBsTytP6ZJShwnFHFMHMfQGuIyolQatGF1kROtBMUsh9mSPs1Qi4R67Oh2A1GaYnUgPUsZvUXliqmvya8ShDbIzJCEQJIJVhcZ48GRProkKQomCjaHkb67ZbkaUH5CHCb6IZCnPfnTObtNTzfC1kcc7mviywVRJeG2R7oIczNxuOvwMuJ8rvBNT72BY+1Yfqr46D94zp/8bMbsk4rLT8+ZLSTb3T2ZjujuIvJ8QrQTbefwdqSIBb6bCEWC2dT4fsQmp+HgpBStGXHtEeEVVVYSecGsAKkiRgN56kmCYZbFiNow1W8YuolhGrm61Mwrhc49+13No+sFOo6ZVYIkduze3lMfDdqnp8xBVuCDYugg/dFTqjRloR5YJi0+jMz7PWLYEOuIxyvNWB+Jz69IyyWhP3I/OCZjCIctPokZeoOUBoehnKWUTCSzhG2oKVcLDtKxfvmc82dX5B9cst93bO6PzIpAJFvCdqRII0JZ0JuRvu05O1+h/PsB80wyOU3+5Bo3tHghqaUkCEW3B9d3fP+Pe0S8JIkLjM5Ie4eaVzSTOhEqyxQRDELFaG9P1NFgcdaCUKedPHXaViGcWnFCCE5Y7t+9NP/eFHGhBCpSSAlKeJAK7wORc1h9CgRIBcp6VBoI6cgQOmSuyWLNnB6ilq731FNCOXtO1DlMPaCLOS6y9GYidQPKehZLibB7hNtRDoroLEXhEZmgiQ4gwY4t23dH/GZG/0VHPk6kOBI9stsdT7MH48mLnFHOCESIfkLGIJVjOIzESUxsazIvkeZIfezp3kQwabInLwluxfFB4bXm0fUjFh9csfrsguj5yOHVhiG0yFlEEsfML0qmSRHSS7I7wzeNYO0uKaZfsRHv6M4S0mkiNEeEcyyfrLj/3qNmgeNmRNwF9DOFPXrS4R5JT6QGzNVAMA2pBL058dpdP6BmMfOXc/zekVzFmN4j6gnvAlKnTAcDW4fOFMd9zVSPOD9SZh7XOFw9cP9ux82rhtG602lwGOg3NZGaaFKF0Ue2g8IMIz4W7NvAOApiL0jSgQiNJiePJYlOKS5K7HcOs2/RAqrLDGclWZxQekFmBaHIiLKU4qwkmNPAUgpBlsVEk6D/IcKbPU9fpogsQY2BsAyk1xKdRAzf9uSpIFkoci1R6Wk1cZwSkjTl8aMrjC5I1/JEVhxH+jc3JEDCLdR/Rx4M/RhIVcQwCVrvWawrrHNU1qPOLxGRQoeYMi0Y7zxjiBgdlARubo6Is5In8w0ffChIqgWxMhz3W96+euDh9S0Zjhc/eUL++
Iqb1zuSl4/pB3UCv1l7itjvLbo2pDoQFQqtAl1rEHc1fXdg++o1xgXe9obDQ8u2VYh9i68UYxhJswg1ddjRkGhNuipp+4F3raRXM+wYeHaxZnfreXK1JE/n2Ebx2ScvaNoNcqZZZo67zcCYaKrH5+RFQqckX2wCN28Hep8zLM/ABQo/sL8PkKZ43xKfSzqhebaM2UwxQc847B37vuXm/h3b4wMPbx5oR4MVMT4M3I0KEdXcfHPk/maPGgL6xQKfSKROSR4mVouExVxhnCcyE/E8w4iRZdfxJF+xzB7T6YyHZsKlF9hEkKuMyQHe8vSiIDlX4AasEFRZTk1K3B9QtgNjCMIQZgoxgPAKqTzIQCRPCfUTk9hACAj5f5u4Ts2IP8DYveFkRDkBIk/gqwCIcOJf4E8x5ZaE7mGCO40aEtI+IfgEuTQgGoLdMGx+oO/3HKKMuFxh45x+a3BbhQgSkcf06RMuLwqsLCiWHjts2LmEcLgkknPSMsb4PcgH0nCguurwpWRWxhgyRJKQZpLUxWy2jnkhyJeas7Mc4tMuajKryBJotgdaDpR/fIY6dKTPXzI++lPKMoOl5bw85/hqz67tMShMGLBe4JYCcbBkR0FaRvg3r9HNnqr/G1Q4kvV7Dgz42TVlc0v8YJmCRkWaqYrJPsiRbUsSUhIdU+eBIduhr3vSxYHZj16RXXoOe3BihvIRYQG99UTvLTZ6MJR5wnjriFYJcpniJxAMqDiiVY7hrmN5VhFXmjh49tIy3o9svh1RpWD+RNBpRzceufvVlnY7svcj7d9t8JsB7Q5M1rAoLX33QHdX48YjxzjnfXAfyRyVaOwwkj7OCElGKCvsaLg6KyhWGXKeU6xzdOPRkSBPE2bnOckiIUlzonlCTIBU8vJf/YiHLxvW5wXVxxWz0dK+NhAiypcV02akSBSTSlCdR0wSbQwuDfhUsigSOIwcvruhq3vy84KH7Ujztxm7byVTdoHWMXkOIXJklWW435KKHDVohvuB0A3MVpJuAJVoZIiIs4i2mchSGL7fczhPGVXJ/tdH6u9/YDbLqKRBjIreCG6/3zJTntlMYA4jVZIgewU6IILD6UBWBoTxbO9rhNY8+fSM1Ys5m9sdZ599jAgOGQKpPG27xMWM2e5IeTCMw4BRpxlQpCL05Hn8YYVqW/qD4/Hjcwpd8CTS/PavvuK3/8c/YDdbts2B1eqKvgtoF+i2D4inBQczsuk0b3ZHQlszmommPvL62x2Dj1BKkpgGcegJuoSjB++wZERZzvE4MrmWKC/JFysyoZjGid3DA8PmlnqQHOqGszjDJi2WnvrwA8XyEXkSUzcBpyTlocfsO2zf4OM5WSI5POw5CE28jvnqux39qHBjQ7KURF6gTUfoJ+ap4PbeEolALlpm+YTPAmWmafpAkieERYnfNkglMZFnigQ2ihi8xklFpCDKEpDytDduPF68x+lK+R6w8rs96vPPP///qSz/7s+//vlffC7T4oScDIJISIx4D7jipFrS8oTv9GYgRCnCgB86snTONHrELGFqW6Sd8yRVTNFIyGbYRBN2DbJpWZQjyh7p+oj0uGUfSUywHB9SFtcrMn9gbmKmVUQcK0yUkMY5/dFQVhXHEKH0RGMjKiUZ9i1HLNFMcvlHj3j7zYHNd3uiKDCZniie0e9bbCVRWYnoYkJZYN61LIWlvjtg6yPjccuoIqpW0pqBo8jInmacFTOGduLm6y16IWg2rzDf3GOTklb3aB3TlhFFdIZTEW7MUEdPMYdubEhzhXgYKeYRUsa09ztMiFmUK45opsLT5B5VLUkXFW47QKy4/CRnb3rSvKLxMaIbKS9yzNGgeosuFTqVaOmJYpCrEudHlPKoAnRsab83RD8+JzQwRB63GVBKoNyEfzlH15bjfYdZVsjJkS0dhztPlKfoVUQXK8oowaFwBrQ60RbTqCCJTmq3BE37Q4/KBvzgSGPLGJ8kAYmX+ONIHkeMh5HqvOCwbSlWJauZYv/FHbJISMqE+ld7kusV5qFBPc0xvWOxmiFETFAnHISQhrRQyDShfbdlFBGxg6A1JtHYPjB/UlJrj9qX7NKST/98xW7TomNP86Yhe7JieTbj4W83lKUhTi2YjrGOEKsKNxwYJ0Xf9ZDB5UdLEudpvrjDNhOFyzjc9zi9wlUV8+BI54ZhOHVQ+76luTecPU857CSLQpDhqHcGmUasz+YnuYVzCJVx9eMPGR46kD3ypuWij6hFT+I0nZaEaSJfKOLYUubnHIygr0fqH47E1Zxy6OkftgTfczw0zC6vuX72CTa1PHx7R7WakacC3z1QTxITlyeV7uZbCBGPr8+JZcJ9PxKnC8rJMPqR9mAoLiqS5KTWM9OAnqf0XYuZLMtHCTrvoW9oxYrrc0EccnoR8+TxnK7f0XWOSEqG8cjq5cfUNzvmBYz+SJqU3G5r0n5ieTGjGyAeRoSMqOYKH+ds778kHF4Rzs/RTNA6ZvOI4dCAzhjNSBZHTElFf7NjeDgS4gjVGtIkYO1IFCuSRDKOIKxFT5Y8ivCTOoW5OJmphAStTkILKTghnAX0zeHd559//t/8v9XP35uTuBDvF9yDxKMQJ0j3CScpJU44IhGReA19S0gls/UCnVrww+m6slLMkxnOQjdlRFWJaxoY7ijPYXecCNefQTlx/izlxfUly6CQbHn47QbrS+41TK/3MECSOUSzY/7U0VQt60WHDSXR4R318YbjzSvG13vab3Z88T//GnN7T3IB9UPDbLXk/rvvSC4F8yolQuPHI7NhRC1i9seOVjviqCd7+Yz46TnueklpFkQ3e87aS6btgapI8VaisgXl0x/h1in54wXruSIUR2K1xTtB1J6TzEqC9oxTS/dmRN2DmFfICbQeyUXCLCi86inilFIr7FlKFATlGDO0EbNLzcE6WBTERiO/3RFiCUMgUYHy0znOxph7ixrACYiPHdpAMs+Ytg03tSZd57CpIZoYdwfGpqO9OWJGy/SrWyatSH48R95D98by7geDPs/RYSCRCbm3tLuBzHsYHPuhQ5DQYBlpSKockVjWTyVip0mERtkI0WhmSUbwMCYJm01DvpzTbD2RkHgv2W8d60crFo8qlJXMP7wkdAP5H61JX0N0Y9CJox56hIoJqWNwEc7GCG9wZUw4HLHOkwvPfF6QlymmHSkWgf4JyGu4+e2RYRzotSB7VLJIs5OVaRHh54p0ndK5lL5z5EPAHhJc71hf5ZSVJo8n6h96hsOEUZIov2JyPyb9zQcs7mMu/tlPWS0E5tuR+EWFilYUy5xovaT76oZ6EOg0YqbAGMk01my7mo3w3DUb9vsHXCwxG8e7OOFtFWiSCaTl7DAgjzHjYNjWW2rVI7YNUT6SX6xZuREvevb9xKAFWSXoxw3b7W/wQiLXJUXkiWtLY2cIEZEVgv7tHUFI1s+vsINi+9uP+Mj8OVPbcDx4JqHRJiPxgVBbRpMS2UCRz/A9hHqkvzd80L7hZ3/8L1nz79O3OXWekBcph9bgohnLF9dYHzj/+BnucE/8pGNYp8gnV5hY4dKK+UxjNw2Z
DOhSUc4LOqvYPjTI4imzT/4EdVcz3Xli4fn6N69xekFZ5Ty5jtE4YtNjZynLMhClJ4CZsI5x8OSpoh8tqY/xQaCyU0jRy1OoXmmJik9qR+ffx/Dfs8X9/4d2yu/NSTxKipOk2L7XilmLs4K8zDDNwOQDUklUmqDTHicc4zjhs4KxmTBToJg0bR8hVCBfznC710x1Q3l5xSwv8Nua+WVJc2cJ+4R3bx+YypLZBzn4iWbTEBJHdr5AjhOxjOjLjKANapQcfrvF9C1nq4z06oKgl8xeXLK4ugQXc/n8EYtMM79c0L1rKS8WTIcB5Upmq4TIjExdi72p8R9GVFcNzmoSZ4gHz7vvbkgWEKbTGuV8lnH/zYbssmJ82FPIkvXLBX73FhGlqChHdVAywwwJzes3lPMEn2uUyyg+TBCZZxcbfJ3w6cePGWrLcGNwRrOtFeaXE0tZcvP9keXzBPkYIjOhfMR02zC6gIoTurpj9rhgaAXVXDH/sKI5TswWCaNQhCI9Ue1SWOk5Q9OiSdk+tMRenVAHqSZ4QXqeMbzpcPMMaSdUHuG/cYSmpvaC+q6nLDPmc83Q7YhKTxmNaDQRCRVrDn1HtztQzEqqKpBKy/aHlmqW4+qGNs64mmUsL1O2D4aLixyXRMjWI1PJvj9Fnu17i1N/19PeTczOY+qFQ8YCs3MndE8XiO1Et28YDz15HJAzwVmacHQW+kC8jtCJI9w3ICKoBZHomfqeTGvOnkQAiz8AACAASURBVKSY1mGMYTWbMQ6a4/cbZp9kRBdw+M4wWMF8pVDyZNc5tjVm1xPLAhEVVENGKj/l43/5H/L6u9cMDw9sRY8aM5rbe6p5zK45sJjPGHqLdwIrNFWwuFQhlEP0AT+0+AB+3yBczdM/u2T2DJzfoy5z/MWfIc+fcPj2W9CG80ph9SXyvuNNfSSdeuq7A/PVGY9WFXbUDCJhbFvaesSaiaAFPllg+o7ICNokRhiDlQphOsxhzzx9zN3Df8d//l9+QWb/nscmZmMSlheSYb8nLiNQCrPvcKkg0hPDUCNWBYd2yZv9L1m/eGDz4MjSBJzFa4M9DGx/qJGDZy0dKk/Y3isiMyNXgW3bIZxjVaS0Dx2ZSnAqIYpi0nlJN3SoEKh/eMvsyRnBbOm95PyTj3kyy7BKcuwWxFowxZL9DwfSqCKtcgynXXSNpKgyhp3DK4fyniTnZAsSoHV88rkGS6xTrLdIGU7zTAE+eMa2/p1O4r83RVwnxfsr80ka+n/BU7wjxPIkjdUnUFVcZDiriK1g2g7gLbOLmGkaSVYS2e0YohE5TGRXT0iN4ObdDpcV1IcRLzuG6RZCR+hqlJfkecHsIqV+2zMdemYvHzN1hxOk512MOIvRNaQvPwbxFLwnOp/jhhp/aDDhSCIGXv3mDW5V4seB4AK984hE4oaB9vKa6Tji7ES1ztlMEcV9h0oHhPHklwX5C01nHMOkeHSlef3mSJRHuLcde3FAjxmHbjp5Iw+OuCrws5Rms0GNFjM7Q06OtJAsziPe/Le/4erPLrj9+xZ7NSFHBU8zyntDPK948tkCFU/MnmlYg5xGpsFRBUvjYlJn8P7ky/RipMw0JCfreHaRIKNAWkSAoW8gV2uaWuEeJrw+AbNOQgNNXsTE85hx0JipJYlT2vK0ymWKnO6uJpl75vOSXCe47kBQECUlijkRNaMbWEnDKvIUxQpPS4fAqNP3SsxAcfWMs2LNMbnHEbMu1UmhVhvcuz19kbC+1Oi0xx4nwsMd6w8znAuQaBJ3Ct6ZTjJfJZydL7h/ZSiTEj0FkjgipJqjFFy8vID6iKMnSjQHpShyjcolPlNUzyuKWYHcRdidQWSS/S92RIVi8UHJ9u8fiI1DP1vTPQiqH895eLVj/eGSsN/T1R5nBE8endP2kM4EX/7NW7ad4NlFzVDt0dUjXNqzux1IfUzfjFTPljRTTSwCoXd0Ty9J3ZHspWD+seTZOqW8DpydZ+wbS/9v35DLmLyToBv2hw73z3/GrDkwpYJm8MheEE+G6XZPWqZM3xu2fUD6hOqyIokLorpjLHLSOCY9D/TvDM/PI4IXHO8PhCJwWazxScmw0/yb/+rfMFv8mifpRFkn6NkjYrElHRXHaEUaBM3untl6znEw2ONIPH9BcvkYmZzTfNkRSsNucyBNUvqN4SIXXGeBIVGIseTN6xoxTRSpIpOCdJFhwkR923P9JCOxgdF78tGghUCVCXc7Q/Uoxtc9SZTy6Mlz7t+2bE1Dv7XsjntM8GQ+Jl0lRLlGMzHsTm4DWWUkmWezHQk6IlYCVIwxJ3WedoIgTk4A90+ALPdeJuECKMHQ/AEV8c9//hefq7h8f4FwJwa080A4rRUGRRKFUwzcORJvibKBaTrFdkWq0INllCX5QrK7bbAuw80q+v1IdFEw7ieq0lFkluGwh3lJVKXIxRnj3Tt6IWnvBs7PKsZlSb+bUN5jtKU8m3H49Q/YqCSOKvJZTLkIJMpSaVDKkZ+tuHx0ztnVmvu//5bkTKM6T1osaUaHD5JS70jLB7InOX074fuaF49jlmdzwlqx+X5kfN0wv1oyW8YorakfepJEMo2evJoxBssQPN4FrMuYgmCWCZRz2DbBeU26NqS3E81CMuiE2cczYjR8IDj8TYebB7JVic8q6tsd99+1JGmBmmD6J2WUEoTas7sZuHi2whjLxfWc+3cD8iKGQrMQgu3XR9KxJYQZUZmBVCyqjLF3REoxtY5ZFtFZd/qzxgLXe6J1Tuo85vt7TFES9wOx0HA7UEaaTdPy/NlL2ngiHGrM0KNry3mZcPtDR5+BUgNbH5MJi+8tmdbUY8ajYsk3TATv2N1ZJgHdZsIqifp4RXY38e6bmqgLxFcpQ5FhpojldYk5jIzHBm81YnCE0TA0ER+eS4ydsKOhHiXGBuzkGNqAwaNcIIiCUkQ0tUfK97fjh4n7TU0UDG5dkq010TKnayeGg2XxH5+hY03990dKASqT+Ls9UX6yyk9HT+hzFlHG969e8cO//RVf3v4v+EVPWQ8YEXE4BlRp2W8d2TIlSWIi64lDoBlhqCq0Hlj8WLHrFYdDzNc3E7xy/PqrHXMs8ScrnBDki4RDNDGGe+RDR20G1nOF7COMSPHvbmjNSKUqPvxoyepZhD2X/MNf/8D6fI6UPZOWzC8ec/vlntlMYRNFM/W4NhB/8JThtqN5+C3FqufJ2RPirOPu65ZxucYt2hPuoFcszxSjnHC7HqcTIhKEsJSzms0XXzPYI+WfFqz9gmbQDPVIsBbvUrpMkBhFqBuen6ekhWccjlgJ9U4SASKTlMbxYBLKhWI6GGyVYkIEZiAOim5ouLi8Zu8atp3E+QFtHcIMjFPDpCOqMgcs27cHVCxBxGRFgRUeMXjQEjR4pzCcZn6SkwZu8h7lHZE6oRgIJwa+RND9jj3x34si/vO//MvPQ5oRnCfSESJorAoE51FC4ydPpg2jPVnHTSxAWGbFHDUFBJ5hVETPlgxffYUn5eKTD1FKcJZr7r/5kqeLglEK1HLJ+Ycfcb1+Tp4vWKyuKH7ylPb1HcE
N+MUc31oeXa2po4RlFuGjlGqeorIYE8Ns4RhtB3KiVSOzecIirkCOKC3JlxVistgycPh2i9eSUQaq+IxN7QnigcsPZ5SXA/urA4M0DLWjVJLVx9ccXh95GGEWSrq2ZmgHZJXR39XookBmmlW15PJHK8Sq4vDFjrujIXu0In8hGLYN1igWH11y+MUtOgQ2/8Mt849n5EHB1Zq8M/RdS+gVH/7pR4xfTfj3L8tEOOqxQwfJ4nFJJxSRdthvLfmz6sQrURLz1zdcnK+o5juasUQNhnwd8e4XG4pHJbFUPFrlNO1AU48UlWYKKVYI5lFGGFvGQXB5EXNzb5ld5ySHiU2iWP50zYE74nrCDguyJCZdzLl7dYT7HhtHRDagM4FmRfAHdKzRU0WePeUXeK5lTZxCaj2i7ynOS+SmIX1cIPYNh1gTd55uq7i4yrCdw+17jIipXEAuYyKZQLfjGN+TL3LSf76ktJLmvsE8KNJrx3jXo5KE1kyIeUKedpiDxnpLeaE5i+f0x5bqUhL2B6K0JK1iFo8LfvirO+gG6mczynKBn0uqdsK2O6zxMAjmoYf5kiKOUKLh7cFy/nygePSEYVwgmlv6Q89qntEpKKyhGxTrKmHqJ4zcsr4KhLuGcZehdEp8p7kPl3zw5Ix6iqk3BrxgfzPx9s2W9WZk/eFzuiql0COmy9CDpTx4ZneQfvaIN0fFza1j3PREJQxuICaQIdl2I7NIk68V7U2PMkeEj0gvDUcfobMl7d2Ou/s7Nvc7UBHLRyv8cY+7r1l/PKcrCoyP0aImzBIGBwjNGM0JWjFfxsSHiKPNufrwx/z0k5gX5wuytESbmA8+vuSH+3ccjWVxvsC5gE8UibWkK8P2zZb07BwzNNjWMX+U8+o3DUkB+1GQakfTd/Sm4Cgs4v4N/rAjjmc8GhXtdGqZKamIvea4PeJnimnQPH4u2T6MlLF6L1kJCOGZzVb04cQ5jyN9cnAqgfEBUKfBr7eE4On/kNopP//5X34epTN0LLHG4JxDo1DBI7ygWpQQFXTHhoDHhAnTeayXuETSTSfqmio8oxxJ1k+pD7cMP/xAH0aK2YqQRVx99ozGKo5M2L6jPYwgJ5Kjpyxzji7Dtj06jjl8/ZaLS8l2qxmHPfX3PU4qlueGbjOwezNw/VIQHVu0r9kcHnBDg51GwgjLsxmmh0EEqrOC/jdb4g/KU5G8jrF3JUIuefKzP+F4O6FvDULl5Gcx219uqZKc1brimAqm3el6ll2kjDLmrCjY9Y79rUCMLZu7nvIsZ/mh5vV//4rZv3hKXHv8Y48iQryouPm7A+c/m2FbTTh6FDWTz1mtNMcfBsy5RxhPvAc1j062GGXxfSBWmuAEEwOojCzS5OPIpb/DHyq+e7VG2R3iHLRXhEiT6wPkEbWxhDQjWhW448gwdSRxRT/C1BmMAKdSlgvJFBKO9yPxecr15QPJ7o6xu6G1Z/Q6wqqeeD0jeXQFxRKbJkyd42gHlumSuajYe0GIE55xZIkjEwMm0SRphps8UZTiZYxTA5eLjPowsCgd01lK+1UNecTl8zXDwwjbAV3skL7B325xh3u6f7ilrg7oxTmqH6hFSpalaBNQzxK0AeQc4wRRiCjXQGrILk844z62BG8wh0C+EOTXGbsv73n06ZzXv6yRaAQNPhgmnSN1St0AKqIEsi5Qfpxz8aLE6TXGPtCogWIx59gPLFrL/EwzmITeg2/vTlakJuE7+Sl2eEFmBqg7Rr8jyiSH1w+EPmHUKUOf89HFiriacTO0ZEnP2TRD+oTp0HA9KB4dHc3C0H35HY8/uuSXX/47uvoBe4R5XDA8dDwbOi73MbZWPD56uiJjPR3ZE7O6mHPcb0/UO2tRL84pfYywNc008NBVrH/yjDfbhu1XN4yPFycz1MOe+flJZ9fd3ZFdLxgTz5NHA8P2C6ZfDRxfv2XzwzuKYMn3ex58TrIWuFwSzy+oXx8o1xmRyjnaiSKLWBSCqfaIVDOvHOFRSY5h2xmyvOJ8saDdDyTHHcnlxySHlmdKschKju2IN3uKeYlXmubhDdfPrrBTYDiOlGWGDQEzORJt6b2FbjwRKp3ABYFSgJCIIN6zyE947j+sIv4Xf/m5Lud88vwZ9/d3J7muEGgVYV1gqlu8P/VVpdR4C3GSkaUnY4wUgUSc/HRmO6BrwfJZBVVOVVt8ds5wkXDcjBy7DnM80u4PxKYjDj32eKR8rKh8xX6c8D3wL15yfjyyq8AdJiJtcZU49QLbLY8+njHefM+m358cg9GIbTqOe4dwGYukYnFxQX9zYDQDXGXoVKEjRdRHNCFBvFvT/OLA1ULihoTsSUmz80z7ERXFiCTi8NCQFIagUqZjS9xr7r6+pVgtmS+h2R/J1xnpOkJOjnCdE79r6DpPriQ3X90TOs+T/+gDhE2pRwmHifJFyrf/2470jxLyJyBbA6sCHLB1qMcRMrZk84R2PyGVJo8V06HBVoJ8OSM2K4Z8znaxpLxMyWcZe39ycSYhkJURvbSM+xHtQF/GhNbQ1xYXTjH/0PXMZjGcVUzBMBOO85/MSY63qPkaXcRMsqL5H1+hQ0Cdp4y2BSUx48igY9KjwSeStYwZbxtcOeLFcHKGCo+SE1GUkgqHdx1+MlS5ZruzJD+6ousHcguyLOB5xu7bO7pOI3QLjycehCN68pjusxnRT0vav3OYXhHOIqoswonAbndENyl9PRA5g8o6rj8+tbfqXqJDgp0mhH7/m0UVt9+05FcezhbsfrEn6uDquoRcnBAH71pWjwqabw+4tsFl0IkjZTrh1Yyn5wvetO/IBoG3musnJd980TJfpYg8IRpapmZEWujbnMfPP6b+bc0y3fH8P3uB/2bPWAfGSHN2XpCMe/S5xdgRywY9l1RtTKI0Ko04vL6lTgumUiGrkmPX8Y/f/AKHRbPi4uoRqu0RQ8NgJxYBgplYYzHOoy5XNOMpdFecXTDc70FGaONoUYiguXj5lCgWbI9HIiUpr1d0P9zRHxpMIlBMpF3P4vFzXOlZnGumv37NyqQkecXdWUXiR7b9wP7Y8FHVYg9zRiPZ7TbkK83eWvIopX69xfY1ssqZFQXj7YFoFZOolHqYiJXHdIHUWaxrKdQl835guvwR1Yf/CXt+yV09oL1HZQXbtiWEgvVlcfK3D45+cEzKIqUg1QrrDDKKwQ8oyclFisbbgJAncTr+FPzp2+MfThH/1z//y89lVPD25u7krZMQlAKtkVJipQTnEfpkGyviBKcswQWqJGIcIEtjjvsN0muK6xXt198w2IGujbFsmCqDaTULHRgaT+I90xHk/hbigHYZyTxinxvC/RG++S2hVTz77Bmbm+7kBxSC86QjLDyh/5YuMywSj9aeiw8+JAoR3HYYDanzZGFFWgVebw6kiSTPIrLO0ZoM1R1RqwVvdgE5fU0UFdw9jEROcfvdgef/7CPefrtBpOC3PS54ghFMwZBdFsSd4/5mi2tjpAZ/IWm/q1k9mtFsJi4/PeN4GMg/ekwlPW0eCL/acNjXVI8CuTPEf1SSDxHToHCJwB8c/fGkYk
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:98cc7a5c65809d230cb3135888110b3be173cd62215988ca389d42ddbc956b42
+ size 61440
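The three `+` lines above are not the file's contents but a Git LFS pointer: the binary payload is replaced in the repository by a spec version line, a `sha256` object id, and the file size in bytes, and `git lfs pull` resolves the pointer to the real file. As a minimal sketch (not part of this commit), such a pointer can be parsed like this:

```python
# Hypothetical helper, not from this repo: parses a Git LFS pointer file
# into its fields. Field names (version, oid, size) follow the LFS pointer
# format shown above; the sample text is the pointer from this commit.

def parse_lfs_pointer(text: str) -> dict:
    """Split each 'key value' line of an LFS pointer into a dict."""
    fields = {}
    for line in text.strip().splitlines():
        key, _, value = line.partition(" ")
        fields[key] = value
    return fields

pointer = """\
version https://git-lfs.github.com/spec/v1
oid sha256:98cc7a5c65809d230cb3135888110b3be173cd62215988ca389d42ddbc956b42
size 61440
"""

info = parse_lfs_pointer(pointer)
print(info["oid"])        # sha256:98cc7a5c...956b42
print(int(info["size"]))  # 61440 -> size of the real file in bytes
```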
23. DeepDream & Neural Style Transfers/23.1 DeepDream.ipynb
CHANGED
The diff for this file is too large to render. See raw diff.
23. DeepDream & Neural Style Transfers/24. Neural Style Transfer.ipynb
CHANGED
The diff for this file is too large to render. See raw diff.
24. GANS_Generative_Networks/MNIST DCGAN.ipynb
CHANGED
@@ -1,918 +1,3 @@
-{
-"cells": [
-{
-"cell_type": "markdown",
-"metadata": {
-"colab_type": "text",
-"id": "_jQ1tEQCxwRx"
-},
-"source": [
-"##### Copyright 2019 The TensorFlow Authors."
-]
-},
-{
-"cell_type": "code",
-"execution_count": 1,
-"metadata": {
-"cellView": "form",
-"colab": {},
-"colab_type": "code",
-"id": "V_sgB_5dx1f1"
-},
-"outputs": [],
-"source": [
-"#@title Licensed under the Apache License, Version 2.0 (the \"License\");\n",
-"# you may not use this file except in compliance with the License.\n",
-"# You may obtain a copy of the License at\n",
-"#\n",
-"# https://www.apache.org/licenses/LICENSE-2.0\n",
-"#\n",
-"# Unless required by applicable law or agreed to in writing, software\n",
-"# distributed under the License is distributed on an \"AS IS\" BASIS,\n",
-"# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n",
-"# See the License for the specific language governing permissions and\n",
-"# limitations under the License."
-]
-},
-{
-"cell_type": "markdown",
-"metadata": {
-"colab_type": "text",
-"id": "rF2x3qooyBTI"
-},
-"source": [
-"# Deep Convolutional Generative Adversarial Network"
-]
-},
-{
-"cell_type": "markdown",
-"metadata": {
-"colab_type": "text",
-"id": "ITZuApL56Mny"
-},
-"source": [
-"This tutorial demonstrates how to generate images of handwritten digits using a [Deep Convolutional Generative Adversarial Network](https://arxiv.org/pdf/1511.06434.pdf) (DCGAN). The code is written using the [Keras Sequential API](https://www.tensorflow.org/guide/keras) with a `tf.GradientTape` training loop."
-]
-},
-{
-"cell_type": "markdown",
-"metadata": {
-"colab_type": "text",
-"id": "2MbKJY38Puy9"
-},
-"source": [
-"## What are GANs?\n",
-"[Generative Adversarial Networks](https://arxiv.org/abs/1406.2661) (GANs) are one of the most interesting ideas in computer science today. Two models are trained simultaneously by an adversarial process. A *generator* (\"the artist\") learns to create images that look real, while a *discriminator* (\"the art critic\") learns to tell real images apart from fakes.\n",
-"\n",
-"\n",
-"\n",
-"During training, the *generator* progressively becomes better at creating images that look real, while the *discriminator* becomes better at telling them apart. The process reaches equilibrium when the *discriminator* can no longer distinguish real images from fakes.\n",
-"\n",
-"\n",
-"\n",
-"This notebook demonstrates this process on the MNIST dataset. The following animation shows a series of images produced by the *generator* as it was trained for 50 epochs. The images begin as random noise, and increasingly resemble hand written digits over time.\n",
-"\n",
-"\n",
-"\n",
-"To learn more about GANs, we recommend MIT's [Intro to Deep Learning](http://introtodeeplearning.com/) course."
-]
-},
-{
-"cell_type": "markdown",
-"metadata": {
-"colab_type": "text",
-"id": "e1_Y75QXJS6h"
-},
-"source": [
-"### Import TensorFlow and other libraries"
-]
-},
-{
-"cell_type": "code",
-"execution_count": 2,
-"metadata": {
-"colab": {},
-"colab_type": "code",
-"id": "WZKbyU2-AiY-"
-},
-"outputs": [],
-"source": [
-"import tensorflow as tf"
-]
-},
-{
-"cell_type": "code",
-"execution_count": 3,
-"metadata": {
-"colab": {},
-"colab_type": "code",
-"id": "wx-zNbLqB4K8"
-},
-"outputs": [
-{
-"data": {
-"text/plain": [
-"'2.1.0'"
-]
-},
-"execution_count": 3,
-"metadata": {},
-"output_type": "execute_result"
-}
-],
-"source": [
-"tf.__version__"
-]
-},
-{
-"cell_type": "code",
-"execution_count": 8,
-"metadata": {
-"colab": {},
-"colab_type": "code",
-"id": "YzTlj4YdCip_"
-},
-"outputs": [],
-"source": [
-"# To generate GIFs\n",
-"!pip install -q imageio"
-]
-},
-{
-"cell_type": "code",
-"execution_count": 9,
-"metadata": {
-"colab": {},
-"colab_type": "code",
-"id": "YfIk2es3hJEd"
-},
-"outputs": [],
-"source": [
-"import glob\n",
-"import imageio\n",
-"import matplotlib.pyplot as plt\n",
-"import numpy as np\n",
-"import os\n",
-"import PIL\n",
-"from tensorflow.keras import layers\n",
-"import time\n",
-"\n",
-"from IPython import display"
-]
-},
-{
-"cell_type": "markdown",
-"metadata": {
-"colab_type": "text",
-"id": "iYn4MdZnKCey"
-},
-"source": [
-"### Load and prepare the dataset\n",
-"\n",
-"You will use the MNIST dataset to train the generator and the discriminator. The generator will generate handwritten digits resembling the MNIST data."
-]
-},
-{
-"cell_type": "code",
-"execution_count": 10,
-"metadata": {
-"colab": {},
-"colab_type": "code",
-"id": "a4fYMGxGhrna"
-},
-"outputs": [],
-"source": [
-"(train_images, train_labels), (_, _) = tf.keras.datasets.mnist.load_data()"
-]
-},
-{
-"cell_type": "code",
-"execution_count": 11,
-"metadata": {
-"colab": {},
-"colab_type": "code",
-"id": "NFC2ghIdiZYE"
-},
-"outputs": [],
-"source": [
-"train_images = train_images.reshape(train_images.shape[0], 28, 28, 1).astype('float32')\n",
-"train_images = (train_images - 127.5) / 127.5 # Normalize the images to [-1, 1]"
-]
-},
-{
-"cell_type": "code",
-"execution_count": 12,
-"metadata": {
-"colab": {},
-"colab_type": "code",
-"id": "S4PIDhoDLbsZ"
-},
-"outputs": [],
-"source": [
-"BUFFER_SIZE = 60000\n",
-"BATCH_SIZE = 256"
-]
-},
-{
-"cell_type": "code",
-"execution_count": 13,
-"metadata": {
-"colab": {},
-"colab_type": "code",
-"id": "-yKCCQOoJ7cn"
-},
-"outputs": [],
-"source": [
-"# Batch and shuffle the data\n",
-"train_dataset = tf.data.Dataset.from_tensor_slices(train_images).shuffle(BUFFER_SIZE).batch(BATCH_SIZE)"
-]
-},
-{
-"cell_type": "markdown",
-"metadata": {
-"colab_type": "text",
-"id": "THY-sZMiQ4UV"
-},
-"source": [
-"## Create the models\n",
-"\n",
-"Both the generator and discriminator are defined using the [Keras Sequential API](https://www.tensorflow.org/guide/keras#sequential_model)."
-]
-},
-{
-"cell_type": "markdown",
-"metadata": {
-"colab_type": "text",
-"id": "-tEyxE-GMC48"
-},
-"source": [
-"### The Generator\n",
-"\n",
-"The generator uses `tf.keras.layers.Conv2DTranspose` (upsampling) layers to produce an image from a seed (random noise). Start with a `Dense` layer that takes this seed as input, then upsample several times until you reach the desired image size of 28x28x1. Notice the `tf.keras.layers.LeakyReLU` activation for each layer, except the output layer which uses tanh."
-]
-},
-{
-"cell_type": "code",
-"execution_count": 14,
-"metadata": {
-"colab": {},
-"colab_type": "code",
-"id": "6bpTcDqoLWjY"
-},
-"outputs": [],
-"source": [
-"def make_generator_model():\n",
-" model = tf.keras.Sequential()\n",
-" model.add(layers.Dense(7*7*256, use_bias=False, input_shape=(100,)))\n",
-" model.add(layers.BatchNormalization())\n",
-" model.add(layers.LeakyReLU())\n",
-"\n",
-" model.add(layers.Reshape((7, 7, 256)))\n",
-" assert model.output_shape == (None, 7, 7, 256) # Note: None is the batch size\n",
-"\n",
-" model.add(layers.Conv2DTranspose(128, (5, 5), strides=(1, 1), padding='same', use_bias=False))\n",
-" assert model.output_shape == (None, 7, 7, 128)\n",
-" model.add(layers.BatchNormalization())\n",
-" model.add(layers.LeakyReLU())\n",
-"\n",
-" model.add(layers.Conv2DTranspose(64, (5, 5), strides=(2, 2), padding='same', use_bias=False))\n",
-" assert model.output_shape == (None, 14, 14, 64)\n",
-" model.add(layers.BatchNormalization())\n",
-" model.add(layers.LeakyReLU())\n",
-"\n",
-" model.add(layers.Conv2DTranspose(1, (5, 5), strides=(2, 2), padding='same', use_bias=False, activation='tanh'))\n",
-" assert model.output_shape == (None, 28, 28, 1)\n",
-"\n",
-" return model"
-]
-},
-{
-"cell_type": "markdown",
-"metadata": {
-"colab_type": "text",
-"id": "GyWgG09LCSJl"
-},
-"source": [
-"Use the (as yet untrained) generator to create an image."
-]
-},
-{
-"cell_type": "code",
-"execution_count": 15,
-"metadata": {
-"colab": {},
-"colab_type": "code",
-"id": "gl7jcC7TdPTG"
-},
-"outputs": [
-{
-"data": {
-"text/plain": [
-"<matplotlib.image.AxesImage at 0x29e15c11dc8>"
-]
-},
-"execution_count": 15,
-"metadata": {},
-"output_type": "execute_result"
-},
-{
-"data": {
-"image/png": "<base64 PNG data omitted>",
-"text/plain": [
-"<Figure size 432x288 with 1 Axes>"
-]
-},
-"metadata": {
-"needs_background": "light"
-},
-"output_type": "display_data"
-}
-],
-"source": [
-"generator = make_generator_model()\n",
-"\n",
-"noise = tf.random.normal([1, 100])\n",
-"generated_image = generator(noise, training=False)\n",
-"\n",
-"plt.imshow(generated_image[0, :, :, 0], cmap='gray')"
-]
-},
-{
-"cell_type": "markdown",
-"metadata": {
-"colab_type": "text",
-"id": "D0IKnaCtg6WE"
-},
-"source": [
-"### The Discriminator\n",
-"\n",
-"The discriminator is a CNN-based image classifier."
-]
-},
-{
-"cell_type": "code",
-"execution_count": 16,
-"metadata": {
-"colab": {},
-"colab_type": "code",
-"id": "dw2tPLmk2pEP"
-},
-"outputs": [],
-"source": [
-"def make_discriminator_model():\n",
-" model = tf.keras.Sequential()\n",
-" model.add(layers.Conv2D(64, (5, 5), strides=(2, 2), padding='same',\n",
-" input_shape=[28, 28, 1]))\n",
-" model.add(layers.LeakyReLU())\n",
-" model.add(layers.Dropout(0.3))\n",
-"\n",
-" model.add(layers.Conv2D(128, (5, 5), strides=(2, 2), padding='same'))\n",
-" model.add(layers.LeakyReLU())\n",
-" model.add(layers.Dropout(0.3))\n",
-"\n",
-" model.add(layers.Flatten())\n",
-" model.add(layers.Dense(1))\n",
-"\n",
-" return model"
-]
-},
-{
-"cell_type": "markdown",
-"metadata": {
-"colab_type": "text",
-"id": "QhPneagzCaQv"
-},
-"source": [
-"Use the (as yet untrained) discriminator to classify the generated images as real or fake. The model will be trained to output positive values for real images, and negative values for fake images."
-]
-},
-{
-"cell_type": "code",
-"execution_count": 17,
-"metadata": {
-"colab": {},
-"colab_type": "code",
-"id": "gDkA05NE6QMs"
-},
-"outputs": [
-{
-"name": "stdout",
-"output_type": "stream",
-"text": [
-"tf.Tensor([[0.00099409]], shape=(1, 1), dtype=float32)\n"
-]
-}
-],
-"source": [
-"discriminator = make_discriminator_model()\n",
-"decision = discriminator(generated_image)\n",
-"print (decision)"
-]
-},
-{
-"cell_type": "markdown",
-"metadata": {
-"colab_type": "text",
-"id": "0FMYgY_mPfTi"
-},
-"source": [
-"## Define the loss and optimizers\n",
-"\n",
-"Define loss functions and optimizers for both models.\n"
-]
-},
-{
-"cell_type": "code",
-"execution_count": 18,
-"metadata": {
-"colab": {},
-"colab_type": "code",
-"id": "psQfmXxYKU3X"
-},
-"outputs": [],
-"source": [
-"# This method returns a helper function to compute cross entropy loss\n",
-"cross_entropy = tf.keras.losses.BinaryCrossentropy(from_logits=True)"
-]
-},
-{
-"cell_type": "markdown",
-"metadata": {
-"colab_type": "text",
-"id": "PKY_iPSPNWoj"
-},
-"source": [
-"### Discriminator loss\n",
-"\n",
-"This method quantifies how well the discriminator is able to distinguish real images from fakes. It compares the discriminator's predictions on real images to an array of 1s, and the discriminator's predictions on fake (generated) images to an array of 0s."
-]
-},
-{
-"cell_type": "code",
-"execution_count": 19,
-"metadata": {
-"colab": {},
-"colab_type": "code",
-"id": "wkMNfBWlT-PV"
-},
-"outputs": [],
-"source": [
-"def discriminator_loss(real_output, fake_output):\n",
-" real_loss = cross_entropy(tf.ones_like(real_output), real_output)\n",
-" fake_loss = cross_entropy(tf.zeros_like(fake_output), fake_output)\n",
-" total_loss = real_loss + fake_loss\n",
-" return total_loss"
-]
-},
-{
-"cell_type": "markdown",
-"metadata": {
-"colab_type": "text",
-"id": "Jd-3GCUEiKtv"
-},
-"source": [
-"### Generator loss\n",
-"The generator's loss quantifies how well it was able to trick the discriminator. Intuitively, if the generator is performing well, the discriminator will classify the fake images as real (or 1). Here, we will compare the discriminators decisions on the generated images to an array of 1s."
-]
-},
-{
-"cell_type": "code",
-"execution_count": 20,
-"metadata": {
-"colab": {},
-"colab_type": "code",
-"id": "90BIcCKcDMxz"
-},
-"outputs": [],
-"source": [
-"def generator_loss(fake_output):\n",
-" return cross_entropy(tf.ones_like(fake_output), fake_output)"
-]
-},
-{
-"cell_type": "markdown",
-"metadata": {
-"colab_type": "text",
-"id": "MgIc7i0th_Iu"
-},
-"source": [
-"The discriminator and the generator optimizers are different since we will train two networks separately."
-]
-},
-{
-"cell_type": "code",
-"execution_count": 21,
-"metadata": {
-"colab": {},
-"colab_type": "code",
-"id": "iWCn_PVdEJZ7"
-},
-"outputs": [],
-"source": [
-"generator_optimizer = tf.keras.optimizers.Adam(1e-4)\n",
-"discriminator_optimizer = tf.keras.optimizers.Adam(1e-4)"
-]
-},
-{
-"cell_type": "markdown",
-"metadata": {
-"colab_type": "text",
-"id": "mWtinsGDPJlV"
-},
-"source": [
-"### Save checkpoints\n",
-"This notebook also demonstrates how to save and restore models, which can be helpful in case a long running training task is interrupted."
-]
-},
-{
-"cell_type": "code",
-"execution_count": 22,
-"metadata": {
-"colab": {},
-"colab_type": "code",
-"id": "CA1w-7s2POEy"
-},
-"outputs": [],
-"source": [
-"checkpoint_dir = './training_checkpoints'\n",
-"checkpoint_prefix = os.path.join(checkpoint_dir, \"ckpt\")\n",
-"checkpoint = tf.train.Checkpoint(generator_optimizer=generator_optimizer,\n",
-" discriminator_optimizer=discriminator_optimizer,\n",
-" generator=generator,\n",
-" discriminator=discriminator)"
-]
-},
-{
-"cell_type": "markdown",
-"metadata": {
-"colab_type": "text",
-"id": "Rw1fkAczTQYh"
-},
-"source": [
-"## Define the training loop\n"
-]
-},
-{
-"cell_type": "code",
-"execution_count": 23,
-"metadata": {
-"colab": {},
-"colab_type": "code",
-"id": "NS2GWywBbAWo"
-},
-"outputs": [],
-"source": [
-"EPOCHS = 50\n",
-"noise_dim = 100\n",
-"num_examples_to_generate = 16\n",
-"\n",
-"# We will reuse this seed overtime (so it's easier)\n",
-"# to visualize progress in the animated GIF)\n",
-"seed = tf.random.normal([num_examples_to_generate, noise_dim])"
-]
-},
-{
-"cell_type": "markdown",
-"metadata": {
-"colab_type": "text",
-"id": "jylSonrqSWfi"
-},
-"source": [
-"The training loop begins with generator receiving a random seed as input. That seed is used to produce an image. The discriminator is then used to classify real images (drawn from the training set) and fakes images (produced by the generator). The loss is calculated for each of these models, and the gradients are used to update the generator and discriminator."
-]
-},
-{
-"cell_type": "code",
-"execution_count": 24,
-"metadata": {
-"colab": {},
-"colab_type": "code",
-"id": "3t5ibNo05jCB"
-},
-"outputs": [],
-"source": [
-"# Notice the use of `tf.function`\n",
-"# This annotation causes the function to be \"compiled\".\n",
-"@tf.function\n",
-"def train_step(images):\n",
-" noise = tf.random.normal([BATCH_SIZE, noise_dim])\n",
-"\n",
-" with tf.GradientTape() as gen_tape, tf.GradientTape() as disc_tape:\n",
-" generated_images = generator(noise, training=True)\n",
-"\n",
-" real_output = discriminator(images, training=True)\n",
-" fake_output = discriminator(generated_images, training=True)\n",
-"\n",
-" gen_loss = generator_loss(fake_output)\n",
-" disc_loss = discriminator_loss(real_output, fake_output)\n",
-"\n",
-" gradients_of_generator = gen_tape.gradient(gen_loss, generator.trainable_variables)\n",
-" gradients_of_discriminator = disc_tape.gradient(disc_loss, discriminator.trainable_variables)\n",
-"\n",
-" generator_optimizer.apply_gradients(zip(gradients_of_generator, generator.trainable_variables))\n",
-" discriminator_optimizer.apply_gradients(zip(gradients_of_discriminator, discriminator.trainable_variables))"
-]
-},
-{
-"cell_type": "code",
-"execution_count": 25,
-"metadata": {
-"colab": {},
-"colab_type": "code",
-"id": "2M7LmLtGEMQJ"
-},
-"outputs": [],
-"source": [
-"def train(dataset, epochs):\n",
-" for epoch in range(epochs):\n",
-" start = time.time()\n",
-"\n",
-" for image_batch in dataset:\n",
-" train_step(image_batch)\n",
-"\n",
-" # Produce images for the GIF as we go\n",
-" display.clear_output(wait=True)\n",
-" generate_and_save_images(generator,\n",
-" epoch + 1,\n",
-" seed)\n",
-"\n",
-" # Save the model every 15 epochs\n",
-" if (epoch + 1) % 15 == 0:\n",
-" checkpoint.save(file_prefix = checkpoint_prefix)\n",
-"\n",
-" print ('Time for epoch {} is {} sec'.format(epoch + 1, time.time()-start))\n",
-"\n",
-" # Generate after the final epoch\n",
-" display.clear_output(wait=True)\n",
-" generate_and_save_images(generator,\n",
-" epochs,\n",
-" seed)"
-]
-},
-{
-"cell_type": "markdown",
-"metadata": {
-"colab_type": "text",
-"id": "2aFF7Hk3XdeW"
-},
-"source": [
-"**Generate and save images**\n"
-]
-},
-{
-"cell_type": "code",
-"execution_count": 26,
-"metadata": {
-"colab": {},
-"colab_type": "code",
-"id": "RmdVsmvhPxyy"
-},
-"outputs": [],
-"source": [
-"def generate_and_save_images(model, epoch, test_input):\n",
-" # Notice `training` is set to False.\n",
-" # This is so all layers run in inference mode (batchnorm).\n",
-" predictions = model(test_input, training=False)\n",
-"\n",
-" fig = plt.figure(figsize=(4,4))\n",
-"\n",
-" for i in range(predictions.shape[0]):\n",
-" plt.subplot(4, 4, i+1)\n",
-" plt.imshow(predictions[i, :, :, 0] * 127.5 + 127.5, cmap='gray')\n",
-" plt.axis('off')\n",
-"\n",
-" plt.savefig('image_at_epoch_{:04d}.png'.format(epoch))\n",
-" plt.show()"
-]
-},
-{
-"cell_type": "markdown",
-"metadata": {
-"colab_type": "text",
-"id": "dZrd4CdjR-Fp"
-},
-"source": [
-"## Train the model\n",
-"Call the `train()` method defined above to train the generator and discriminator simultaneously. Note, training GANs can be tricky. It's important that the generator and discriminator do not overpower each other (e.g., that they train at a similar rate).\n",
-"\n",
-"At the beginning of the training, the generated images look like random noise. As training progresses, the generated digits will look increasingly real. After about 50 epochs, they resemble MNIST digits. This may take about one minute / epoch with the default settings on Colab."
-]
-},
-{
-"cell_type": "code",
-"execution_count": null,
-"metadata": {
-"colab": {},
-"colab_type": "code",
-"id": "Ly3UN0SLLY2l"
-},
-"outputs": [],
-"source": [
-"train(train_dataset, EPOCHS)"
-]
-},
-{
-"cell_type": "markdown",
-"metadata": {
-"colab_type": "text",
-"id": "rfM4YcPVPkNO"
-},
-"source": [
-"Restore the latest checkpoint."
-]
-},
-{
-"cell_type": "code",
-"execution_count": 24,
-"metadata": {
-"colab": {},
-"colab_type": "code",
-"id": "XhXsd0srPo8c"
-},
-"outputs": [
-{
-"data": {
-"text/plain": [
-"<tensorflow.python.training.tracking.util.CheckpointLoadStatus at 0x7f89c41bfba8>"
-]
-},
-"execution_count": 24,
-"metadata": {},
-"output_type": "execute_result"
-}
-],
-"source": [
-"checkpoint.restore(tf.train.latest_checkpoint(checkpoint_dir))"
-]
-},
-{
-"cell_type": "markdown",
-"metadata": {
-"colab_type": "text",
-"id": "P4M_vIbUi7c0"
-},
-"source": [
-"## Create a GIF\n"
-]
-},
-{
-"cell_type": "code",
-"execution_count": 25,
-"metadata": {
-"colab": {},
-"colab_type": "code",
-"id": "WfO5wCdclHGL"
-},
-"outputs": [],
-"source": [
-"# Display a single image using the epoch number\n",
-"def display_image(epoch_no):\n",
-" return PIL.Image.open('image_at_epoch_{:04d}.png'.format(epoch_no))"
-]
-},
-{
-"cell_type": "code",
-"execution_count": 26,
-"metadata": {
-"colab": {},
-"colab_type": "code",
-"id": "5x3q9_Oe5q0A"
-},
-"outputs": [
-{
-"data": {
-"image/png": "<base64 PNG data omitted>"
|
| 321 |
-
"text/plain": [
|
| 322 |
-
"<Figure size 432x288 with 1 Axes>"
|
| 323 |
-
]
|
| 324 |
-
},
|
| 325 |
-
"metadata": {
|
| 326 |
-
"needs_background": "light"
|
| 327 |
-
},
|
| 328 |
-
"output_type": "display_data"
|
| 329 |
-
}
|
| 330 |
-
],
|
| 331 |
-
"source": [
|
| 332 |
-
"generator = make_generator_model()\n",
|
| 333 |
-
"\n",
|
| 334 |
-
"noise = tf.random.normal([1, 100])\n",
|
| 335 |
-
"generated_image = generator(noise, training=False)\n",
|
| 336 |
-
"\n",
|
| 337 |
-
"plt.imshow(generated_image[0, :, :, 0], cmap='gray')"
|
| 338 |
-
]
|
| 339 |
-
},
|
| 340 |
-
{
|
| 341 |
-
"cell_type": "markdown",
|
| 342 |
-
"metadata": {
|
| 343 |
-
"colab_type": "text",
|
| 344 |
-
"id": "D0IKnaCtg6WE"
|
| 345 |
-
},
|
| 346 |
-
"source": [
|
| 347 |
-
"### The Discriminator\n",
|
| 348 |
-
"\n",
|
| 349 |
-
"The discriminator is a CNN-based image classifier."
|
| 350 |
-
]
|
| 351 |
-
},
|
| 352 |
-
{
|
| 353 |
-
"cell_type": "code",
|
| 354 |
-
"execution_count": 16,
|
| 355 |
-
"metadata": {
|
| 356 |
-
"colab": {},
|
| 357 |
-
"colab_type": "code",
|
| 358 |
-
"id": "dw2tPLmk2pEP"
|
| 359 |
-
},
|
| 360 |
-
"outputs": [],
|
| 361 |
-
"source": [
|
| 362 |
-
"def make_discriminator_model():\n",
|
| 363 |
-
" model = tf.keras.Sequential()\n",
|
| 364 |
-
" model.add(layers.Conv2D(64, (5, 5), strides=(2, 2), padding='same',\n",
|
| 365 |
-
" input_shape=[28, 28, 1]))\n",
|
| 366 |
-
" model.add(layers.LeakyReLU())\n",
|
| 367 |
-
" model.add(layers.Dropout(0.3))\n",
|
| 368 |
-
"\n",
|
| 369 |
-
" model.add(layers.Conv2D(128, (5, 5), strides=(2, 2), padding='same'))\n",
|
| 370 |
-
" model.add(layers.LeakyReLU())\n",
|
| 371 |
-
" model.add(layers.Dropout(0.3))\n",
|
| 372 |
-
"\n",
|
| 373 |
-
" model.add(layers.Flatten())\n",
|
| 374 |
-
" model.add(layers.Dense(1))\n",
|
| 375 |
-
"\n",
|
| 376 |
-
" return model"
|
| 377 |
-
]
|
| 378 |
-
},
|
| 379 |
-
{
|
| 380 |
-
"cell_type": "markdown",
|
| 381 |
-
"metadata": {
|
| 382 |
-
"colab_type": "text",
|
| 383 |
-
"id": "QhPneagzCaQv"
|
| 384 |
-
},
|
| 385 |
-
"source": [
|
| 386 |
-
"Use the (as yet untrained) discriminator to classify the generated images as real or fake. The model will be trained to output positive values for real images, and negative values for fake images."
|
| 387 |
-
]
|
| 388 |
-
},
|
| 389 |
-
{
|
| 390 |
-
"cell_type": "code",
|
| 391 |
-
"execution_count": 17,
|
| 392 |
-
"metadata": {
|
| 393 |
-
"colab": {},
|
| 394 |
-
"colab_type": "code",
|
| 395 |
-
"id": "gDkA05NE6QMs"
|
| 396 |
-
},
|
| 397 |
-
"outputs": [
|
| 398 |
-
{
|
| 399 |
-
"name": "stdout",
|
| 400 |
-
"output_type": "stream",
|
| 401 |
-
"text": [
|
| 402 |
-
"tf.Tensor([[0.00099409]], shape=(1, 1), dtype=float32)\n"
|
| 403 |
-
]
|
| 404 |
-
}
|
| 405 |
-
],
|
| 406 |
-
"source": [
|
| 407 |
-
"discriminator = make_discriminator_model()\n",
|
| 408 |
-
"decision = discriminator(generated_image)\n",
|
| 409 |
-
"print (decision)"
|
| 410 |
-
]
|
| 411 |
-
},
|
| 412 |
-
{
|
| 413 |
-
"cell_type": "markdown",
|
| 414 |
-
"metadata": {
|
| 415 |
-
"colab_type": "text",
|
| 416 |
-
"id": "0FMYgY_mPfTi"
|
| 417 |
-
},
|
| 418 |
-
"source": [
|
| 419 |
-
"## Define the loss and optimizers\n",
|
| 420 |
-
"\n",
|
| 421 |
-
"Define loss functions and optimizers for both models.\n"
|
| 422 |
-
]
|
| 423 |
-
},
|
| 424 |
-
{
|
| 425 |
-
"cell_type": "code",
|
| 426 |
-
"execution_count": 18,
|
| 427 |
-
"metadata": {
|
| 428 |
-
"colab": {},
|
| 429 |
-
"colab_type": "code",
|
| 430 |
-
"id": "psQfmXxYKU3X"
|
| 431 |
-
},
|
| 432 |
-
"outputs": [],
|
| 433 |
-
"source": [
|
| 434 |
-
"# This method returns a helper function to compute cross entropy loss\n",
|
| 435 |
-
"cross_entropy = tf.keras.losses.BinaryCrossentropy(from_logits=True)"
|
| 436 |
-
]
|
| 437 |
-
},
|
| 438 |
-
{
|
| 439 |
-
"cell_type": "markdown",
|
| 440 |
-
"metadata": {
|
| 441 |
-
"colab_type": "text",
|
| 442 |
-
"id": "PKY_iPSPNWoj"
|
| 443 |
-
},
|
| 444 |
-
"source": [
|
| 445 |
-
"### Discriminator loss\n",
|
| 446 |
-
"\n",
|
| 447 |
-
"This method quantifies how well the discriminator is able to distinguish real images from fakes. It compares the discriminator's predictions on real images to an array of 1s, and the discriminator's predictions on fake (generated) images to an array of 0s."
|
| 448 |
-
]
|
| 449 |
-
},
|
| 450 |
-
{
|
| 451 |
-
"cell_type": "code",
|
| 452 |
-
"execution_count": 19,
|
| 453 |
-
"metadata": {
|
| 454 |
-
"colab": {},
|
| 455 |
-
"colab_type": "code",
|
| 456 |
-
"id": "wkMNfBWlT-PV"
|
| 457 |
-
},
|
| 458 |
-
"outputs": [],
|
| 459 |
-
"source": [
|
| 460 |
-
"def discriminator_loss(real_output, fake_output):\n",
|
| 461 |
-
" real_loss = cross_entropy(tf.ones_like(real_output), real_output)\n",
|
| 462 |
-
" fake_loss = cross_entropy(tf.zeros_like(fake_output), fake_output)\n",
|
| 463 |
-
" total_loss = real_loss + fake_loss\n",
|
| 464 |
-
" return total_loss"
|
| 465 |
-
]
|
| 466 |
-
},
|
| 467 |
-
{
|
| 468 |
-
"cell_type": "markdown",
|
| 469 |
-
"metadata": {
|
| 470 |
-
"colab_type": "text",
|
| 471 |
-
"id": "Jd-3GCUEiKtv"
|
| 472 |
-
},
|
| 473 |
-
"source": [
|
| 474 |
-
"### Generator loss\n",
|
| 475 |
-
"The generator's loss quantifies how well it was able to trick the discriminator. Intuitively, if the generator is performing well, the discriminator will classify the fake images as real (or 1). Here, we will compare the discriminators decisions on the generated images to an array of 1s."
|
| 476 |
-
]
|
| 477 |
-
},
|
| 478 |
-
{
|
| 479 |
-
"cell_type": "code",
|
| 480 |
-
"execution_count": 20,
|
| 481 |
-
"metadata": {
|
| 482 |
-
"colab": {},
|
| 483 |
-
"colab_type": "code",
|
| 484 |
-
"id": "90BIcCKcDMxz"
|
| 485 |
-
},
|
| 486 |
-
"outputs": [],
|
| 487 |
-
"source": [
|
| 488 |
-
"def generator_loss(fake_output):\n",
|
| 489 |
-
" return cross_entropy(tf.ones_like(fake_output), fake_output)"
|
| 490 |
-
]
|
| 491 |
-
},
|
| 492 |
-
{
|
| 493 |
-
"cell_type": "markdown",
|
| 494 |
-
"metadata": {
|
| 495 |
-
"colab_type": "text",
|
| 496 |
-
"id": "MgIc7i0th_Iu"
|
| 497 |
-
},
|
| 498 |
-
"source": [
|
| 499 |
-
"The discriminator and the generator optimizers are different since we will train two networks separately."
|
| 500 |
-
]
|
| 501 |
-
},
|
| 502 |
-
{
|
| 503 |
-
"cell_type": "code",
|
| 504 |
-
"execution_count": 21,
|
| 505 |
-
"metadata": {
|
| 506 |
-
"colab": {},
|
| 507 |
-
"colab_type": "code",
|
| 508 |
-
"id": "iWCn_PVdEJZ7"
|
| 509 |
-
},
|
| 510 |
-
"outputs": [],
|
| 511 |
-
"source": [
|
| 512 |
-
"generator_optimizer = tf.keras.optimizers.Adam(1e-4)\n",
|
| 513 |
-
"discriminator_optimizer = tf.keras.optimizers.Adam(1e-4)"
|
| 514 |
-
]
|
| 515 |
-
},
|
| 516 |
-
{
|
| 517 |
-
"cell_type": "markdown",
|
| 518 |
-
"metadata": {
|
| 519 |
-
"colab_type": "text",
|
| 520 |
-
"id": "mWtinsGDPJlV"
|
| 521 |
-
},
|
| 522 |
-
"source": [
|
| 523 |
-
"### Save checkpoints\n",
|
| 524 |
-
"This notebook also demonstrates how to save and restore models, which can be helpful in case a long running training task is interrupted."
|
| 525 |
-
]
|
| 526 |
-
},
|
| 527 |
-
{
|
| 528 |
-
"cell_type": "code",
|
| 529 |
-
"execution_count": 22,
|
| 530 |
-
"metadata": {
|
| 531 |
-
"colab": {},
|
| 532 |
-
"colab_type": "code",
|
| 533 |
-
"id": "CA1w-7s2POEy"
|
| 534 |
-
},
|
| 535 |
-
"outputs": [],
|
| 536 |
-
"source": [
|
| 537 |
-
"checkpoint_dir = './training_checkpoints'\n",
|
| 538 |
-
"checkpoint_prefix = os.path.join(checkpoint_dir, \"ckpt\")\n",
|
| 539 |
-
"checkpoint = tf.train.Checkpoint(generator_optimizer=generator_optimizer,\n",
|
| 540 |
-
" discriminator_optimizer=discriminator_optimizer,\n",
|
| 541 |
-
" generator=generator,\n",
|
| 542 |
-
" discriminator=discriminator)"
|
| 543 |
-
]
|
| 544 |
-
},
|
| 545 |
-
{
|
| 546 |
-
"cell_type": "markdown",
|
| 547 |
-
"metadata": {
|
| 548 |
-
"colab_type": "text",
|
| 549 |
-
"id": "Rw1fkAczTQYh"
|
| 550 |
-
},
|
| 551 |
-
"source": [
|
| 552 |
-
"## Define the training loop\n"
|
| 553 |
-
]
|
| 554 |
-
},
|
| 555 |
-
{
|
| 556 |
-
"cell_type": "code",
|
| 557 |
-
"execution_count": 23,
|
| 558 |
-
"metadata": {
|
| 559 |
-
"colab": {},
|
| 560 |
-
"colab_type": "code",
|
| 561 |
-
"id": "NS2GWywBbAWo"
|
| 562 |
-
},
|
| 563 |
-
"outputs": [],
|
| 564 |
-
"source": [
|
| 565 |
-
"EPOCHS = 50\n",
|
| 566 |
-
"noise_dim = 100\n",
|
| 567 |
-
"num_examples_to_generate = 16\n",
|
| 568 |
-
"\n",
|
| 569 |
-
"# We will reuse this seed overtime (so it's easier)\n",
|
| 570 |
-
"# to visualize progress in the animated GIF)\n",
|
| 571 |
-
"seed = tf.random.normal([num_examples_to_generate, noise_dim])"
|
| 572 |
-
]
|
| 573 |
-
},
|
| 574 |
-
{
|
| 575 |
-
"cell_type": "markdown",
|
| 576 |
-
"metadata": {
|
| 577 |
-
"colab_type": "text",
|
| 578 |
-
"id": "jylSonrqSWfi"
|
| 579 |
-
},
|
| 580 |
-
"source": [
|
| 581 |
-
"The training loop begins with generator receiving a random seed as input. That seed is used to produce an image. The discriminator is then used to classify real images (drawn from the training set) and fakes images (produced by the generator). The loss is calculated for each of these models, and the gradients are used to update the generator and discriminator."
|
| 582 |
-
]
|
| 583 |
-
},
|
| 584 |
-
{
|
| 585 |
-
"cell_type": "code",
|
| 586 |
-
"execution_count": 24,
|
| 587 |
-
"metadata": {
|
| 588 |
-
"colab": {},
|
| 589 |
-
"colab_type": "code",
|
| 590 |
-
"id": "3t5ibNo05jCB"
|
| 591 |
-
},
|
| 592 |
-
"outputs": [],
|
| 593 |
-
"source": [
|
| 594 |
-
"# Notice the use of `tf.function`\n",
|
| 595 |
-
"# This annotation causes the function to be \"compiled\".\n",
|
| 596 |
-
"@tf.function\n",
|
| 597 |
-
"def train_step(images):\n",
|
| 598 |
-
" noise = tf.random.normal([BATCH_SIZE, noise_dim])\n",
|
| 599 |
-
"\n",
|
| 600 |
-
" with tf.GradientTape() as gen_tape, tf.GradientTape() as disc_tape:\n",
|
| 601 |
-
" generated_images = generator(noise, training=True)\n",
|
| 602 |
-
"\n",
|
| 603 |
-
" real_output = discriminator(images, training=True)\n",
|
| 604 |
-
" fake_output = discriminator(generated_images, training=True)\n",
|
| 605 |
-
"\n",
|
| 606 |
-
" gen_loss = generator_loss(fake_output)\n",
|
| 607 |
-
" disc_loss = discriminator_loss(real_output, fake_output)\n",
|
| 608 |
-
"\n",
|
| 609 |
-
" gradients_of_generator = gen_tape.gradient(gen_loss, generator.trainable_variables)\n",
|
| 610 |
-
" gradients_of_discriminator = disc_tape.gradient(disc_loss, discriminator.trainable_variables)\n",
|
| 611 |
-
"\n",
|
| 612 |
-
" generator_optimizer.apply_gradients(zip(gradients_of_generator, generator.trainable_variables))\n",
|
| 613 |
-
" discriminator_optimizer.apply_gradients(zip(gradients_of_discriminator, discriminator.trainable_variables))"
|
| 614 |
-
]
|
| 615 |
-
},
|
| 616 |
-
{
|
| 617 |
-
"cell_type": "code",
|
| 618 |
-
"execution_count": 25,
|
| 619 |
-
"metadata": {
|
| 620 |
-
"colab": {},
|
| 621 |
-
"colab_type": "code",
|
| 622 |
-
"id": "2M7LmLtGEMQJ"
|
| 623 |
-
},
|
| 624 |
-
"outputs": [],
|
| 625 |
-
"source": [
|
| 626 |
-
"def train(dataset, epochs):\n",
|
| 627 |
-
" for epoch in range(epochs):\n",
|
| 628 |
-
" start = time.time()\n",
|
| 629 |
-
"\n",
|
| 630 |
-
" for image_batch in dataset:\n",
|
| 631 |
-
" train_step(image_batch)\n",
|
| 632 |
-
"\n",
|
| 633 |
-
" # Produce images for the GIF as we go\n",
|
| 634 |
-
" display.clear_output(wait=True)\n",
|
| 635 |
-
" generate_and_save_images(generator,\n",
|
| 636 |
-
" epoch + 1,\n",
|
| 637 |
-
" seed)\n",
|
| 638 |
-
"\n",
|
| 639 |
-
" # Save the model every 15 epochs\n",
|
| 640 |
-
" if (epoch + 1) % 15 == 0:\n",
|
| 641 |
-
" checkpoint.save(file_prefix = checkpoint_prefix)\n",
|
| 642 |
-
"\n",
|
| 643 |
-
" print ('Time for epoch {} is {} sec'.format(epoch + 1, time.time()-start))\n",
|
| 644 |
-
"\n",
|
| 645 |
-
" # Generate after the final epoch\n",
|
| 646 |
-
" display.clear_output(wait=True)\n",
|
| 647 |
-
" generate_and_save_images(generator,\n",
|
| 648 |
-
" epochs,\n",
|
| 649 |
-
" seed)"
|
| 650 |
-
]
|
| 651 |
-
},
|
| 652 |
-
{
|
| 653 |
-
"cell_type": "markdown",
|
| 654 |
-
"metadata": {
|
| 655 |
-
"colab_type": "text",
|
| 656 |
-
"id": "2aFF7Hk3XdeW"
|
| 657 |
-
},
|
| 658 |
-
"source": [
|
| 659 |
-
"**Generate and save images**\n"
|
| 660 |
-
]
|
| 661 |
-
},
|
| 662 |
-
{
|
| 663 |
-
"cell_type": "code",
|
| 664 |
-
"execution_count": 26,
|
| 665 |
-
"metadata": {
|
| 666 |
-
"colab": {},
|
| 667 |
-
"colab_type": "code",
|
| 668 |
-
"id": "RmdVsmvhPxyy"
|
| 669 |
-
},
|
| 670 |
-
"outputs": [],
|
| 671 |
-
"source": [
|
| 672 |
-
"def generate_and_save_images(model, epoch, test_input):\n",
|
| 673 |
-
" # Notice `training` is set to False.\n",
|
| 674 |
-
" # This is so all layers run in inference mode (batchnorm).\n",
|
| 675 |
-
" predictions = model(test_input, training=False)\n",
|
| 676 |
-
"\n",
|
| 677 |
-
" fig = plt.figure(figsize=(4,4))\n",
|
| 678 |
-
"\n",
|
| 679 |
-
" for i in range(predictions.shape[0]):\n",
|
| 680 |
-
" plt.subplot(4, 4, i+1)\n",
|
| 681 |
-
" plt.imshow(predictions[i, :, :, 0] * 127.5 + 127.5, cmap='gray')\n",
|
| 682 |
-
" plt.axis('off')\n",
|
| 683 |
-
"\n",
|
| 684 |
-
" plt.savefig('image_at_epoch_{:04d}.png'.format(epoch))\n",
|
| 685 |
-
" plt.show()"
|
| 686 |
-
]
|
| 687 |
-
},
|
| 688 |
-
{
|
| 689 |
-
"cell_type": "markdown",
|
| 690 |
-
"metadata": {
|
| 691 |
-
"colab_type": "text",
|
| 692 |
-
"id": "dZrd4CdjR-Fp"
|
| 693 |
-
},
|
| 694 |
-
"source": [
|
| 695 |
-
"## Train the model\n",
|
| 696 |
-
"Call the `train()` method defined above to train the generator and discriminator simultaneously. Note, training GANs can be tricky. It's important that the generator and discriminator do not overpower each other (e.g., that they train at a similar rate).\n",
|
| 697 |
-
"\n",
|
| 698 |
-
"At the beginning of the training, the generated images look like random noise. As training progresses, the generated digits will look increasingly real. After about 50 epochs, they resemble MNIST digits. This may take about one minute / epoch with the default settings on Colab."
|
| 699 |
-
]
|
| 700 |
-
},
|
| 701 |
-
{
|
| 702 |
-
"cell_type": "code",
|
| 703 |
-
"execution_count": null,
|
| 704 |
-
"metadata": {
|
| 705 |
-
"colab": {},
|
| 706 |
-
"colab_type": "code",
|
| 707 |
-
"id": "Ly3UN0SLLY2l"
|
| 708 |
-
},
|
| 709 |
-
"outputs": [],
|
| 710 |
-
"source": [
|
| 711 |
-
"train(train_dataset, EPOCHS)"
|
| 712 |
-
]
|
| 713 |
-
},
|
| 714 |
-
{
|
| 715 |
-
"cell_type": "markdown",
|
| 716 |
-
"metadata": {
|
| 717 |
-
"colab_type": "text",
|
| 718 |
-
"id": "rfM4YcPVPkNO"
|
| 719 |
-
},
|
| 720 |
-
"source": [
|
| 721 |
-
"Restore the latest checkpoint."
|
| 722 |
-
]
|
| 723 |
-
},
|
| 724 |
-
{
|
| 725 |
-
"cell_type": "code",
|
| 726 |
-
"execution_count": 24,
|
| 727 |
-
"metadata": {
|
| 728 |
-
"colab": {},
|
| 729 |
-
"colab_type": "code",
|
| 730 |
-
"id": "XhXsd0srPo8c"
|
| 731 |
-
},
|
| 732 |
-
"outputs": [
|
| 733 |
-
{
|
| 734 |
-
"data": {
|
| 735 |
-
"text/plain": [
|
| 736 |
-
"<tensorflow.python.training.tracking.util.CheckpointLoadStatus at 0x7f89c41bfba8>"
|
| 737 |
-
]
|
| 738 |
-
},
|
| 739 |
-
"execution_count": 24,
|
| 740 |
-
"metadata": {},
|
| 741 |
-
"output_type": "execute_result"
|
| 742 |
-
}
|
| 743 |
-
],
|
| 744 |
-
"source": [
|
| 745 |
-
"checkpoint.restore(tf.train.latest_checkpoint(checkpoint_dir))"
|
| 746 |
-
]
|
| 747 |
-
},
|
-  {
-   "cell_type": "markdown",
-   "metadata": {
-    "colab_type": "text",
-    "id": "P4M_vIbUi7c0"
-   },
-   "source": [
-    "## Create a GIF\n"
-   ]
-  },
-  {
-   "cell_type": "code",
-   "execution_count": 25,
-   "metadata": {
-    "colab": {},
-    "colab_type": "code",
-    "id": "WfO5wCdclHGL"
-   },
-   "outputs": [],
-   "source": [
-    "# Display a single image using the epoch number\n",
-    "def display_image(epoch_no):\n",
-    "  return PIL.Image.open('image_at_epoch_{:04d}.png'.format(epoch_no))"
-   ]
-  },
-  {
-   "cell_type": "code",
-   "execution_count": 26,
-   "metadata": {
-    "colab": {},
-    "colab_type": "code",
-    "id": "5x3q9_Oe5q0A"
-   },
-   "outputs": [
-    {
-     "data": {
-      "image/png": "iVBORw0KGgoAAAANSUhEUgAA...[~70 KB of base64-encoded PNG (a 4x4 grid of generated MNIST digits, 288x288) omitted for readability]...SUVORK5CYII=\n",
-      "text/plain": [
-       "<PIL.PngImagePlugin.PngImageFile image mode=RGBA size=288x288 at 0x7F8AECFBC7F0>"
-      ]
-     },
-     "execution_count": 26,
-     "metadata": {},
-     "output_type": "execute_result"
-    }
-   ],
-   "source": [
-    "display_image(EPOCHS)"
-   ]
-  },
-  {
-   "cell_type": "markdown",
-   "metadata": {
-    "colab_type": "text",
-    "id": "NywiH3nL8guF"
-   },
-   "source": [
-    "Use `imageio` to create an animated gif using the images saved during training."
-   ]
-  },
-  {
-   "cell_type": "code",
-   "execution_count": 27,
-   "metadata": {
-    "colab": {},
-    "colab_type": "code",
-    "id": "IGKQgENQ8lEI"
-   },
-   "outputs": [],
-   "source": [
-    "anim_file = 'dcgan.gif'\n",
-    "\n",
-    "with imageio.get_writer(anim_file, mode='I') as writer:\n",
-    "  filenames = glob.glob('image*.png')\n",
-    "  filenames = sorted(filenames)\n",
-    "  last = -1\n",
-    "  for i,filename in enumerate(filenames):\n",
-    "    frame = 2*(i**0.5)\n",
-    "    if round(frame) > round(last):\n",
-    "      last = frame\n",
-    "    else:\n",
-    "      continue\n",
-    "    image = imageio.imread(filename)\n",
-    "    writer.append_data(image)\n",
-    "  image = imageio.imread(filename)\n",
-    "  writer.append_data(image)\n",
-    "\n",
-    "import IPython\n",
-    "if IPython.version_info > (6,2,0,''):\n",
-    "  display.Image(filename=anim_file)"
-   ]
-  },
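
Two details of the cell above are easy to misread. First, the final `imread`/`append_data` pair sits outside the loop, so the last frame is written twice and the gif lingers on the finished result. Second, `2*(i**0.5)` thins frames on a square-root schedule: `round(frame)` advances on almost every early index and less and less often later, so the gif spends more time on the early epochs, where the samples change fastest. A standalone replay of the rule shows which epoch indices survive:

    # Replays the cell's thinning rule to show which epoch images reach the gif.
    kept, last = [], -1
    for i in range(30):
        frame = 2 * (i ** 0.5)
        if round(frame) > round(last):
            last = frame
            kept.append(i)
    print(kept)  # [0, 1, 2, 4, 6, 8, 11, 15, 19, 23, 28] -- gaps widen with i
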
-  {
-   "cell_type": "markdown",
-   "metadata": {
-    "colab_type": "text",
-    "id": "cGhC3-fMWSwl"
-   },
-   "source": [
-    "If you're working in Colab you can download the animation with the code below:"
-   ]
-  },
-  {
-   "cell_type": "code",
-   "execution_count": 28,
-   "metadata": {
-    "colab": {},
-    "colab_type": "code",
-    "id": "uV0yiKpzNP1b"
-   },
-   "outputs": [],
-   "source": [
-    "try:\n",
-    "  from google.colab import files\n",
-    "except ImportError:\n",
-    "  pass\n",
-    "else:\n",
-    "  files.download(anim_file)"
-   ]
-  },
-  {
-   "cell_type": "markdown",
-   "metadata": {
-    "colab_type": "text",
-    "id": "k6qC-SbjK0yW"
-   },
-   "source": [
-    "## Next steps\n"
-   ]
-  },
-  {
-   "cell_type": "markdown",
-   "metadata": {
-    "colab_type": "text",
-    "id": "xjjkT9KAK6H7"
-   },
-   "source": [
-    "This tutorial has shown the complete code necessary to write and train a GAN. As a next step, you might like to experiment with a different dataset, for example the Large-scale Celeb Faces Attributes (CelebA) dataset [available on Kaggle](https://www.kaggle.com/jessicali9530/celeba-dataset). To learn more about GANs we recommend the [NIPS 2016 Tutorial: Generative Adversarial Networks](https://arxiv.org/abs/1701.00160).\n"
-   ]
-  }
- ],
- "metadata": {
-  "accelerator": "GPU",
-  "colab": {
-   "collapsed_sections": [],
-   "name": "dcgan.ipynb",
-   "private_outputs": true,
-   "provenance": [],
-   "toc_visible": true
-  },
-  "kernelspec": {
-   "display_name": "Python 3",
-   "language": "python",
-   "name": "python3"
-  },
-  "language_info": {
-   "codemirror_mode": {
-    "name": "ipython",
-    "version": 3
-   },
-   "file_extension": ".py",
-   "mimetype": "text/x-python",
-   "name": "python",
-   "nbconvert_exporter": "python",
-   "pygments_lexer": "ipython3",
-   "version": "3.7.4"
-  }
- },
- "nbformat": 4,
- "nbformat_minor": 1
-}
+version https://git-lfs.github.com/spec/v1
+oid sha256:759d8be78ea02a4d58803b933fa469d1735368195f60434e29491001b0aee55c
+size 70339
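
With this commit the repository keeps only the three-line pointer above; the 70,339-byte notebook body itself now lives in LFS storage under the `oid` hash. To materialize the real file in a local clone, the standard git-lfs commands apply (nothing here is specific to this repository):

    git lfs install
    git lfs pull --include="24. GANS_Generative_Networks/MNIST DCGAN.ipynb"
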
25. Face Recognition/.ipynb_checkpoints/25.0 Face Extraction from Video - Build Dataset-checkpoint.ipynb
CHANGED
@@ -1,93 +1,3 @@
-{
- "cells": [
-  {
-   "cell_type": "markdown",
-   "metadata": {},
-   "source": [
-    "### Extracting the faces from a video"
-   ]
-  },
-  {
-   "cell_type": "code",
-   "execution_count": null,
-   "metadata": {},
-   "outputs": [],
-   "source": [
-    "from os import listdir\n",
-    "from os.path import isfile, join\n",
-    "import os\n",
-    "import cv2\n",
-    "import dlib\n",
-    "import numpy as np\n",
-    "\n",
-    "# Define Image Path Here\n",
-    "image_path = \"./images/\"\n",
-    "\n",
-    "def draw_label(image, point, label, font=cv2.FONT_HERSHEY_SIMPLEX,\n",
-    "               font_scale=0.8, thickness=1):\n",
-    "    size = cv2.getTextSize(label, font, font_scale, thickness)[0]\n",
-    "    x, y = point\n",
-    "    cv2.rectangle(image, (x, y - size[1]), (x + size[0], y), (255, 0, 0), cv2.FILLED)\n",
-    "    cv2.putText(image, label, point, font, font_scale, (255, 255, 255), thickness, lineType=cv2.LINE_AA)\n",
-    "\n",
-    "detector = dlib.get_frontal_face_detector()\n",
-    "\n",
-    "# Open the source video (a live webcam would use cv2.VideoCapture(0) instead)\n",
-    "cap = cv2.VideoCapture('testfriends.mp4')\n",
-    "img_size = 64\n",
-    "margin = 0.2\n",
-    "frame_count = 0\n",
-    "\n",
-    "while True:\n",
-    "    ret, frame = cap.read()\n",
-    "    if not ret:  # stop cleanly once the video runs out of frames\n",
-    "        break\n",
-    "    frame_count += 1\n",
-    "    print(frame_count)\n",
-    "\n",
-    "    input_img = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)\n",
-    "    img_h, img_w, _ = np.shape(input_img)\n",
-    "    detected = detector(frame, 1)\n",
-    "    faces = []\n",
-    "\n",
-    "    if len(detected) > 0:\n",
-    "        for i, d in enumerate(detected):\n",
-    "            x1, y1, x2, y2, w, h = d.left(), d.top(), d.right() + 1, d.bottom() + 1, d.width(), d.height()\n",
-    "            xw1 = max(int(x1 - margin * w), 0)\n",
-    "            yw1 = max(int(y1 - margin * h), 0)\n",
-    "            xw2 = min(int(x2 + margin * w), img_w - 1)\n",
-    "            yw2 = min(int(y2 + margin * h), img_h - 1)\n",
-    "            face = frame[yw1:yw2 + 1, xw1:xw2 + 1, :]\n",
-    "            file_name = \"./faces/\"+str(frame_count)+\"_\"+str(i)+\".jpg\"\n",
-    "            cv2.imwrite(file_name, face)\n",
-    "            cv2.rectangle(frame, (x1, y1), (x2, y2), (255, 0, 0), 2)\n",
-    "\n",
-    "    cv2.imshow(\"Face Detector\", frame)\n",
-    "    if cv2.waitKey(1) == 13: #13 is the Enter Key\n",
-    "        break\n",
-    "\n",
-    "cap.release()\n",
-    "cv2.destroyAllWindows()"
-   ]
-  }
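
The margin arithmetic in the loop above grows each dlib detection box by 20% on every side and clamps it to the frame, so border crops never index out of range. The same logic, pulled out as a standalone (hypothetical) helper for reference:

    def expand_box(x1, y1, x2, y2, img_w, img_h, margin=0.2):
        # Hypothetical helper mirroring the notebook's inline margin math.
        w, h = x2 - x1, y2 - y1
        xw1 = max(int(x1 - margin * w), 0)          # clamp left edge at 0
        yw1 = max(int(y1 - margin * h), 0)          # clamp top edge at 0
        xw2 = min(int(x2 + margin * w), img_w - 1)  # clamp right edge inside frame
        yw2 = min(int(y2 + margin * h), img_h - 1)  # clamp bottom edge inside frame
        return xw1, yw1, xw2, yw2
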
- ],
- "metadata": {
-  "kernelspec": {
-   "display_name": "Python 3",
-   "language": "python",
-   "name": "python3"
-  },
-  "language_info": {
-   "codemirror_mode": {
-    "name": "ipython",
-    "version": 3
-   },
-   "file_extension": ".py",
-   "mimetype": "text/x-python",
-   "name": "python",
-   "nbconvert_exporter": "python",
-   "pygments_lexer": "ipython3",
-   "version": "3.7.4"
-  }
- },
- "nbformat": 4,
- "nbformat_minor": 2
-}
+version https://git-lfs.github.com/spec/v1
+oid sha256:2b36b2231a5feaa366136c6aa0dc9026c9ec3b2b83bce6e52eaa64a462fc9786
+size 2867
25. Face Recognition/.ipynb_checkpoints/25.1 Face Recognition - Friends Characters - Train and Test-checkpoint.ipynb
CHANGED
@@ -1,536 +1,3 @@
-{
- "cells": [
-  {
-   "cell_type": "markdown",
-   "metadata": {},
-   "source": [
-    "# Basic Deep Learning Face Recognition\n",
-    "## Building a Friends TV Show Character Identifier"
-   ]
-  },
-  {
-   "cell_type": "markdown",
-   "metadata": {},
-   "source": [
-    "## The learning objective of this lesson (25.1) is to create a 'dumb' face classifier using our LittleVGG model. We simply train it on hundreds of pictures of each Friends character, then test the model on a test video.\n",
-    "\n",
-    "## You will see why this is an ineffective way to do face recognition.\n",
-    "## Why? Because a traditional NN classifier can only recognize the identities it was trained on; it has to be retrained for every new face."
-   ]
-  },
-  {
-   "cell_type": "markdown",
-   "metadata": {},
-   "source": [
-    "### Let's train our model\n",
-    "I've created a dataset with the faces of 4 Friends characters taken from a handful of different scenes."
-   ]
-  },
-  {
-   "cell_type": "code",
-   "execution_count": 33,
-   "metadata": {},
-   "outputs": [
-    {
-     "name": "stdout",
-     "output_type": "stream",
-     "text": [
-      "Found 2663 images belonging to 4 classes.\n",
-      "Found 955 images belonging to 4 classes.\n"
-     ]
-    }
-   ],
-   "source": [
-    "from __future__ import print_function\n",
-    "import keras\n",
-    "from keras.preprocessing.image import ImageDataGenerator\n",
-    "from keras.models import Sequential\n",
-    "from keras.layers import Dense, Dropout, Activation, Flatten, BatchNormalization\n",
-    "from keras.layers import Conv2D, MaxPooling2D\n",
-    "import os\n",
-    "\n",
-    "num_classes = 4\n",
-    "img_rows, img_cols = 48, 48\n",
-    "batch_size = 16\n",
-    "\n",
-    "train_data_dir = './faces/train'\n",
-    "validation_data_dir = './faces/validation'\n",
-    "\n",
-    "# Let's use some data augmentation\n",
-    "train_datagen = ImageDataGenerator(\n",
-    "      rescale=1./255,\n",
-    "      rotation_range=30,\n",
-    "      shear_range=0.3,\n",
-    "      zoom_range=0.3,\n",
-    "      width_shift_range=0.4,\n",
-    "      height_shift_range=0.4,\n",
-    "      horizontal_flip=True,\n",
-    "      fill_mode='nearest')\n",
-    "\n",
-    "validation_datagen = ImageDataGenerator(rescale=1./255)\n",
-    "\n",
-    "train_generator = train_datagen.flow_from_directory(\n",
-    "        train_data_dir,\n",
-    "        target_size=(img_rows, img_cols),\n",
-    "        batch_size=batch_size,\n",
-    "        class_mode='categorical',\n",
-    "        shuffle=True)\n",
-    "\n",
-    "validation_generator = validation_datagen.flow_from_directory(\n",
-    "        validation_data_dir,\n",
-    "        target_size=(img_rows, img_cols),\n",
-    "        batch_size=batch_size,\n",
-    "        class_mode='categorical',\n",
-    "        shuffle=True)"
-   ]
-  },
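
`flow_from_directory` infers the four classes from the sub-folder names under each directory; that is where the "Found 2663 images belonging to 4 classes" counts in the output above come from. The dataset is therefore expected to be laid out one folder per character (the folder names below are illustrative, not taken from the actual dataset):

    ./faces/train/character_a/*.jpg        ./faces/validation/character_a/*.jpg
    ./faces/train/character_b/*.jpg        ./faces/validation/character_b/*.jpg
    ...

    # The generator exposes the inferred name -> label-index mapping:
    print(train_generator.class_indices)  # e.g. {'character_a': 0, 'character_b': 1, ...}
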
-  {
-   "cell_type": "code",
-   "execution_count": 37,
-   "metadata": {},
-   "outputs": [],
-   "source": [
-    "# Our Keras imports\n",
-    "from keras.models import Sequential\n",
-    "from keras.layers.normalization import BatchNormalization\n",
-    "from keras.layers.convolutional import Conv2D, MaxPooling2D\n",
-    "from keras.layers.advanced_activations import ELU\n",
-    "from keras.layers.core import Activation, Flatten, Dropout, Dense"
-   ]
-  },
-  {
-   "cell_type": "markdown",
-   "metadata": {},
-   "source": [
-    "### Creating a simple VGG based model for Face Recognition"
-   ]
-  },
-  {
-   "cell_type": "code",
-   "execution_count": 35,
-   "metadata": {},
-   "outputs": [
-    {
-     "name": "stdout",
-     "output_type": "stream",
-     "text": [
-      "_________________________________________________________________\n",
-      "Layer (type)                 Output Shape              Param #   \n",
-      "=================================================================\n",
-      "conv2d_25 (Conv2D)           (None, 48, 48, 32)        896       \n",
-      "_________________________________________________________________\n",
-      "activation_34 (Activation)   (None, 48, 48, 32)        0         \n",
-      "_________________________________________________________________\n",
-      "batch_normalization_31 (Batc (None, 48, 48, 32)        128       \n",
-      "_________________________________________________________________\n",
-      "conv2d_26 (Conv2D)           (None, 48, 48, 32)        9248      \n",
-      "_________________________________________________________________\n",
-      "activation_35 (Activation)   (None, 48, 48, 32)        0         \n",
-      "_________________________________________________________________\n",
-      "batch_normalization_32 (Batc (None, 48, 48, 32)        128       \n",
-      "_________________________________________________________________\n",
-      "max_pooling2d_13 (MaxPooling (None, 24, 24, 32)        0         \n",
-      "_________________________________________________________________\n",
-      "dropout_19 (Dropout)         (None, 24, 24, 32)        0         \n",
-      "_________________________________________________________________\n",
-      "conv2d_27 (Conv2D)           (None, 24, 24, 64)        18496     \n",
-      "_________________________________________________________________\n",
-      "activation_36 (Activation)   (None, 24, 24, 64)        0         \n",
-      "_________________________________________________________________\n",
-      "batch_normalization_33 (Batc (None, 24, 24, 64)        256       \n",
-      "_________________________________________________________________\n",
-      "conv2d_28 (Conv2D)           (None, 24, 24, 64)        36928     \n",
-      "_________________________________________________________________\n",
-      "activation_37 (Activation)   (None, 24, 24, 64)        0         \n",
-      "_________________________________________________________________\n",
-      "batch_normalization_34 (Batc (None, 24, 24, 64)        256       \n",
-      "_________________________________________________________________\n",
-      "max_pooling2d_14 (MaxPooling (None, 12, 12, 64)        0         \n",
-      "_________________________________________________________________\n",
-      "dropout_20 (Dropout)         (None, 12, 12, 64)        0         \n",
-      "_________________________________________________________________\n",
-      "conv2d_29 (Conv2D)           (None, 12, 12, 128)       73856     \n",
-      "_________________________________________________________________\n",
-      "activation_38 (Activation)   (None, 12, 12, 128)       0         \n",
-      "_________________________________________________________________\n",
-      "batch_normalization_35 (Batc (None, 12, 12, 128)       512       \n",
-      "_________________________________________________________________\n",
-      "conv2d_30 (Conv2D)           (None, 12, 12, 128)       147584    \n",
-      "_________________________________________________________________\n",
-      "activation_39 (Activation)   (None, 12, 12, 128)       0         \n",
-      "_________________________________________________________________\n",
-      "batch_normalization_36 (Batc (None, 12, 12, 128)       512      \n",
-      "_________________________________________________________________\n",
-      "max_pooling2d_15 (MaxPooling (None, 6, 6, 128)         0         \n",
-      "_________________________________________________________________\n",
-      "dropout_21 (Dropout)         (None, 6, 6, 128)         0         \n",
-      "_________________________________________________________________\n",
-      "conv2d_31 (Conv2D)           (None, 6, 6, 256)         295168    \n",
-      "_________________________________________________________________\n",
-      "activation_40 (Activation)   (None, 6, 6, 256)         0         \n",
-      "_________________________________________________________________\n",
-      "batch_normalization_37 (Batc (None, 6, 6, 256)         1024      \n",
-      "_________________________________________________________________\n",
-      "conv2d_32 (Conv2D)           (None, 6, 6, 256)         590080
|
| 176 |
-
"_________________________________________________________________\n",
|
| 177 |
-
"activation_41 (Activation) (None, 6, 6, 256) 0 \n",
|
| 178 |
-
"_________________________________________________________________\n",
|
| 179 |
-
"batch_normalization_38 (Batc (None, 6, 6, 256) 1024 \n",
|
| 180 |
-
"_________________________________________________________________\n",
|
| 181 |
-
"max_pooling2d_16 (MaxPooling (None, 3, 3, 256) 0 \n",
|
| 182 |
-
"_________________________________________________________________\n",
|
| 183 |
-
"dropout_22 (Dropout) (None, 3, 3, 256) 0 \n",
|
| 184 |
-
"_________________________________________________________________\n",
|
| 185 |
-
"flatten_4 (Flatten) (None, 2304) 0 \n",
|
| 186 |
-
"_________________________________________________________________\n",
|
| 187 |
-
"dense_10 (Dense) (None, 64) 147520 \n",
|
| 188 |
-
"_________________________________________________________________\n",
|
| 189 |
-
"activation_42 (Activation) (None, 64) 0 \n",
|
| 190 |
-
"_________________________________________________________________\n",
|
| 191 |
-
"batch_normalization_39 (Batc (None, 64) 256 \n",
|
| 192 |
-
"_________________________________________________________________\n",
|
| 193 |
-
"dropout_23 (Dropout) (None, 64) 0 \n",
|
| 194 |
-
"_________________________________________________________________\n",
|
| 195 |
-
"dense_11 (Dense) (None, 64) 4160 \n",
|
| 196 |
-
"_________________________________________________________________\n",
|
| 197 |
-
"activation_43 (Activation) (None, 64) 0 \n",
|
| 198 |
-
"_________________________________________________________________\n",
|
| 199 |
-
"batch_normalization_40 (Batc (None, 64) 256 \n",
|
| 200 |
-
"_________________________________________________________________\n",
|
| 201 |
-
"dropout_24 (Dropout) (None, 64) 0 \n",
|
| 202 |
-
"_________________________________________________________________\n",
|
| 203 |
-
"dense_12 (Dense) (None, 4) 260 \n",
|
| 204 |
-
"_________________________________________________________________\n",
|
| 205 |
-
"activation_44 (Activation) (None, 4) 0 \n",
|
| 206 |
-
"=================================================================\n",
|
| 207 |
-
"Total params: 1,328,548\n",
|
| 208 |
-
"Trainable params: 1,326,372\n",
|
| 209 |
-
"Non-trainable params: 2,176\n",
|
| 210 |
-
"_________________________________________________________________\n",
|
| 211 |
-
"None\n"
|
| 212 |
-
]
|
| 213 |
-
}
|
| 214 |
-
],
|
| 215 |
-
"source": [
|
| 216 |
-
"model = Sequential()\n",
|
| 217 |
-
"\n",
|
| 218 |
-
"model.add(Conv2D(32, (3, 3), padding = 'same', kernel_initializer=\"he_normal\",\n",
|
| 219 |
-
" input_shape = (img_rows, img_cols, 3)))\n",
|
| 220 |
-
"model.add(Activation('elu'))\n",
|
| 221 |
-
"model.add(BatchNormalization())\n",
|
| 222 |
-
"model.add(Conv2D(32, (3, 3), padding = \"same\", kernel_initializer=\"he_normal\", \n",
|
| 223 |
-
" input_shape = (img_rows, img_cols, 3)))\n",
|
| 224 |
-
"model.add(Activation('elu'))\n",
|
| 225 |
-
"model.add(BatchNormalization())\n",
|
| 226 |
-
"model.add(MaxPooling2D(pool_size=(2, 2)))\n",
|
| 227 |
-
"model.add(Dropout(0.2))\n",
|
| 228 |
-
"\n",
|
| 229 |
-
"# Block #2: second CONV => RELU => CONV => RELU => POOL\n",
|
| 230 |
-
"# layer set\n",
|
| 231 |
-
"model.add(Conv2D(64, (3, 3), padding=\"same\", kernel_initializer=\"he_normal\"))\n",
|
| 232 |
-
"model.add(Activation('elu'))\n",
|
| 233 |
-
"model.add(BatchNormalization())\n",
|
| 234 |
-
"model.add(Conv2D(64, (3, 3), padding=\"same\", kernel_initializer=\"he_normal\"))\n",
|
| 235 |
-
"model.add(Activation('elu'))\n",
|
| 236 |
-
"model.add(BatchNormalization())\n",
|
| 237 |
-
"model.add(MaxPooling2D(pool_size=(2, 2)))\n",
|
| 238 |
-
"model.add(Dropout(0.2))\n",
|
| 239 |
-
"\n",
|
| 240 |
-
"# Block #3: third CONV => RELU => CONV => RELU => POOL\n",
|
| 241 |
-
"# layer set\n",
|
| 242 |
-
"model.add(Conv2D(128, (3, 3), padding=\"same\", kernel_initializer=\"he_normal\"))\n",
|
| 243 |
-
"model.add(Activation('elu'))\n",
|
| 244 |
-
"model.add(BatchNormalization())\n",
|
| 245 |
-
"model.add(Conv2D(128, (3, 3), padding=\"same\", kernel_initializer=\"he_normal\"))\n",
|
| 246 |
-
"model.add(Activation('elu'))\n",
|
| 247 |
-
"model.add(BatchNormalization())\n",
|
| 248 |
-
"model.add(MaxPooling2D(pool_size=(2, 2)))\n",
|
| 249 |
-
"model.add(Dropout(0.2))\n",
|
| 250 |
-
"\n",
|
| 251 |
-
"# Block #4: third CONV => RELU => CONV => RELU => POOL\n",
|
| 252 |
-
"# layer set\n",
|
| 253 |
-
"model.add(Conv2D(256, (3, 3), padding=\"same\", kernel_initializer=\"he_normal\"))\n",
|
| 254 |
-
"model.add(Activation('elu'))\n",
|
| 255 |
-
"model.add(BatchNormalization())\n",
|
| 256 |
-
"model.add(Conv2D(256, (3, 3), padding=\"same\", kernel_initializer=\"he_normal\"))\n",
|
| 257 |
-
"model.add(Activation('elu'))\n",
|
| 258 |
-
"model.add(BatchNormalization())\n",
|
| 259 |
-
"model.add(MaxPooling2D(pool_size=(2, 2)))\n",
|
| 260 |
-
"model.add(Dropout(0.2))\n",
|
| 261 |
-
"\n",
|
| 262 |
-
"# Block #5: first set of FC => RELU layers\n",
|
| 263 |
-
"model.add(Flatten())\n",
|
| 264 |
-
"model.add(Dense(64, kernel_initializer=\"he_normal\"))\n",
|
| 265 |
-
"model.add(Activation('elu'))\n",
|
| 266 |
-
"model.add(BatchNormalization())\n",
|
| 267 |
-
"model.add(Dropout(0.5))\n",
|
| 268 |
-
"\n",
|
| 269 |
-
"# Block #6: second set of FC => RELU layers\n",
|
| 270 |
-
"model.add(Dense(64, kernel_initializer=\"he_normal\"))\n",
|
| 271 |
-
"model.add(Activation('elu'))\n",
|
| 272 |
-
"model.add(BatchNormalization())\n",
|
| 273 |
-
"model.add(Dropout(0.5))\n",
|
| 274 |
-
"\n",
|
| 275 |
-
"# Block #7: softmax classifier\n",
|
| 276 |
-
"model.add(Dense(num_classes, kernel_initializer=\"he_normal\"))\n",
|
| 277 |
-
"model.add(Activation(\"softmax\"))\n",
|
| 278 |
-
"\n",
|
| 279 |
-
"print(model.summary())"
|
| 280 |
-
]
|
| 281 |
-
},
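The parameter counts in the summary above can be verified by hand: a conv layer has (kernel_h * kernel_w * in_channels + 1 bias) * filters parameters, and a dense layer has inputs * units + units. For example:

assert (3*3*3 + 1) * 32 == 896        # conv2d_25 (RGB input, 32 filters)
assert (3*3*32 + 1) * 32 == 9248      # conv2d_26
assert 3*3*256 == 2304                # flatten_4 output size
assert 2304*64 + 64 == 147520         # dense_10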
|
| 282 |
-
{
|
| 283 |
-
"cell_type": "markdown",
|
| 284 |
-
"metadata": {},
|
| 285 |
-
"source": [
|
| 286 |
-
"### Training our Model"
|
| 287 |
-
]
|
| 288 |
-
},
|
| 289 |
-
{
|
| 290 |
-
"cell_type": "code",
|
| 291 |
-
"execution_count": 36,
|
| 292 |
-
"metadata": {},
|
| 293 |
-
"outputs": [
|
| 294 |
-
{
|
| 295 |
-
"name": "stdout",
|
| 296 |
-
"output_type": "stream",
|
| 297 |
-
"text": [
|
| 298 |
-
"Epoch 1/10\n",
|
| 299 |
-
"166/166 [==============================] - 76s 457ms/step - loss: 1.1153 - acc: 0.5700 - val_loss: 1.4428 - val_acc: 0.4841\n",
|
| 300 |
-
"\n",
|
| 301 |
-
"Epoch 00001: val_loss improved from inf to 1.44279, saving model to /home/deeplearningcv/DeepLearningCV/Trained Models/face_recognition_friends_vgg.h5\n",
|
| 302 |
-
"Epoch 2/10\n",
|
| 303 |
-
"166/166 [==============================] - 67s 403ms/step - loss: 0.7034 - acc: 0.7343 - val_loss: 3.7705 - val_acc: 0.2705\n",
|
| 304 |
-
"\n",
|
| 305 |
-
"Epoch 00002: val_loss did not improve from 1.44279\n",
|
| 306 |
-
"Epoch 3/10\n",
|
| 307 |
-
"166/166 [==============================] - 62s 373ms/step - loss: 0.6037 - acc: 0.7690 - val_loss: 0.9403 - val_acc: 0.6912\n",
|
| 308 |
-
"\n",
|
| 309 |
-
"Epoch 00003: val_loss improved from 1.44279 to 0.94025, saving model to /home/deeplearningcv/DeepLearningCV/Trained Models/face_recognition_friends_vgg.h5\n",
|
| 310 |
-
"Epoch 4/10\n",
|
| 311 |
-
"166/166 [==============================] - 62s 373ms/step - loss: 0.5432 - acc: 0.7988 - val_loss: 1.3018 - val_acc: 0.5548\n",
|
| 312 |
-
"\n",
|
| 313 |
-
"Epoch 00004: val_loss did not improve from 0.94025\n",
|
| 314 |
-
"Epoch 5/10\n",
|
| 315 |
-
"166/166 [==============================] - 69s 414ms/step - loss: 0.4715 - acc: 0.8301 - val_loss: 3.8879 - val_acc: 0.1534\n",
|
| 316 |
-
"\n",
|
| 317 |
-
"Epoch 00005: val_loss did not improve from 0.94025\n",
|
| 318 |
-
"Epoch 6/10\n",
|
| 319 |
-
"166/166 [==============================] - 77s 467ms/step - loss: 0.4233 - acc: 0.8524 - val_loss: 0.6878 - val_acc: 0.7093\n",
|
| 320 |
-
"\n",
|
| 321 |
-
"Epoch 00006: val_loss improved from 0.94025 to 0.68784, saving model to /home/deeplearningcv/DeepLearningCV/Trained Models/face_recognition_friends_vgg.h5\n",
|
| 322 |
-
"Epoch 7/10\n",
|
| 323 |
-
"166/166 [==============================] - 71s 429ms/step - loss: 0.4130 - acc: 0.8636 - val_loss: 3.3402 - val_acc: 0.2971\n",
|
| 324 |
-
"\n",
|
| 325 |
-
"Epoch 00007: val_loss did not improve from 0.68784\n",
|
| 326 |
-
"Epoch 8/10\n",
|
| 327 |
-
"166/166 [==============================] - 79s 477ms/step - loss: 0.3821 - acc: 0.8748 - val_loss: 2.6729 - val_acc: 0.6283\n",
|
| 328 |
-
"\n",
|
| 329 |
-
"Epoch 00008: val_loss did not improve from 0.68784\n",
|
| 330 |
-
"Epoch 9/10\n",
|
| 331 |
-
"166/166 [==============================] - 86s 519ms/step - loss: 0.3622 - acc: 0.8709 - val_loss: 1.5067 - val_acc: 0.5197\n",
|
| 332 |
-
"Restoring model weights from the end of the best epoch\n",
|
| 333 |
-
"\n",
|
| 334 |
-
"Epoch 00009: val_loss did not improve from 0.68784\n",
|
| 335 |
-
"\n",
|
| 336 |
-
"Epoch 00009: ReduceLROnPlateau reducing learning rate to 0.0019999999552965165.\n",
|
| 337 |
-
"Epoch 00009: early stopping\n"
|
| 338 |
-
]
|
| 339 |
-
}
|
| 340 |
-
],
|
| 341 |
-
"source": [
|
| 342 |
-
"from keras.optimizers import RMSprop, SGD, Adam\n",
|
| 343 |
-
"from keras.callbacks import ModelCheckpoint, EarlyStopping, ReduceLROnPlateau\n",
|
| 344 |
-
"\n",
|
| 345 |
-
" \n",
|
| 346 |
-
"checkpoint = ModelCheckpoint(\"/home/deeplearningcv/DeepLearningCV/Trained Models/face_recognition_friends_vgg.h5\",\n",
|
| 347 |
-
" monitor=\"val_loss\",\n",
|
| 348 |
-
" mode=\"min\",\n",
|
| 349 |
-
" save_best_only = True,\n",
|
| 350 |
-
" verbose=1)\n",
|
| 351 |
-
"\n",
|
| 352 |
-
"earlystop = EarlyStopping(monitor = 'val_loss', \n",
|
| 353 |
-
" min_delta = 0, \n",
|
| 354 |
-
" patience = 3,\n",
|
| 355 |
-
" verbose = 1,\n",
|
| 356 |
-
" restore_best_weights = True)\n",
|
| 357 |
-
"\n",
|
| 358 |
-
"reduce_lr = ReduceLROnPlateau(monitor = 'val_loss', factor = 0.2, patience = 3, verbose = 1, min_delta = 0.0001)\n",
|
| 359 |
-
"\n",
|
| 360 |
-
"# we put our call backs into a callback list\n",
|
| 361 |
-
"callbacks = [earlystop, checkpoint, reduce_lr]\n",
|
| 362 |
-
"\n",
|
| 363 |
-
"# We use a very small learning rate \n",
|
| 364 |
-
"model.compile(loss = 'categorical_crossentropy',\n",
|
| 365 |
-
" optimizer = Adam(lr=0.01),\n",
|
| 366 |
-
" metrics = ['accuracy'])\n",
|
| 367 |
-
"\n",
|
| 368 |
-
"nb_train_samples = 2663\n",
|
| 369 |
-
"nb_validation_samples = 955\n",
|
| 370 |
-
"epochs = 10\n",
|
| 371 |
-
"\n",
|
| 372 |
-
"history = model.fit_generator(\n",
|
| 373 |
-
" train_generator,\n",
|
| 374 |
-
" steps_per_epoch = nb_train_samples // batch_size,\n",
|
| 375 |
-
" epochs = epochs,\n",
|
| 376 |
-
" callbacks = callbacks,\n",
|
| 377 |
-
" validation_data = validation_generator,\n",
|
| 378 |
-
" validation_steps = nb_validation_samples // batch_size)"
|
| 379 |
-
]
|
| 380 |
-
},
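Since `fit_generator` returns a `History` object, the loss curves can be plotted to visualize the unstable validation loss seen in the log above. A minimal sketch, assuming matplotlib is installed (metric key names vary between Keras versions, e.g. 'acc' vs 'accuracy'):

import matplotlib.pyplot as plt

plt.plot(history.history['loss'], label='train loss')
plt.plot(history.history['val_loss'], label='val loss')
plt.xlabel('epoch')
plt.ylabel('loss')
plt.legend()
plt.show()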
|
| 381 |
-
{
|
| 382 |
-
"cell_type": "markdown",
|
| 383 |
-
"metadata": {},
|
| 384 |
-
"source": [
|
| 385 |
-
"#### Getting our Class Labels"
|
| 386 |
-
]
|
| 387 |
-
},
|
| 388 |
-
{
|
| 389 |
-
"cell_type": "code",
|
| 390 |
-
"execution_count": 39,
|
| 391 |
-
"metadata": {},
|
| 392 |
-
"outputs": [
|
| 393 |
-
{
|
| 394 |
-
"data": {
|
| 395 |
-
"text/plain": [
|
| 396 |
-
"{0: 'Chandler', 1: 'Joey', 2: 'Pheobe', 3: 'Rachel'}"
|
| 397 |
-
]
|
| 398 |
-
},
|
| 399 |
-
"execution_count": 39,
|
| 400 |
-
"metadata": {},
|
| 401 |
-
"output_type": "execute_result"
|
| 402 |
-
}
|
| 403 |
-
],
|
| 404 |
-
"source": [
|
| 405 |
-
"class_labels = validation_generator.class_indices\n",
|
| 406 |
-
"class_labels = {v: k for k, v in class_labels.items()}\n",
|
| 407 |
-
"classes = list(class_labels.values())\n",
|
| 408 |
-
"class_labels"
|
| 409 |
-
]
|
| 410 |
-
},
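With the inverted mapping, a raw prediction can be turned back into a character name. A sketch, assuming `face` is a hypothetical preprocessed batch of shape (1, 48, 48, 3):

preds = model.predict(face)[0]        # 4 class probabilities
print(class_labels[preds.argmax()])   # e.g. 'Joey'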
|
| 411 |
-
{
|
| 412 |
-
"cell_type": "code",
|
| 413 |
-
"execution_count": null,
|
| 414 |
-
"metadata": {},
|
| 415 |
-
"outputs": [],
|
| 416 |
-
"source": [
|
| 417 |
-
"# Load our model\n",
|
| 418 |
-
"from keras.models import load_model\n",
|
| 419 |
-
"\n",
|
| 420 |
-
"classifier = load_model('/home/deeplearningcv/DeepLearningCV/Trained Models/face_recognition_friends_vgg.h5')"
|
| 421 |
-
]
|
| 422 |
-
},
|
| 423 |
-
{
|
| 424 |
-
"cell_type": "markdown",
|
| 425 |
-
"metadata": {},
|
| 426 |
-
"source": [
|
| 427 |
-
"### Testing our model on some real video"
|
| 428 |
-
]
|
| 429 |
-
},
|
| 430 |
-
{
|
| 431 |
-
"cell_type": "code",
|
| 432 |
-
"execution_count": 43,
|
| 433 |
-
"metadata": {},
|
| 434 |
-
"outputs": [],
|
| 435 |
-
"source": [
|
| 436 |
-
"from os import listdir\n",
|
| 437 |
-
"from os.path import isfile, join\n",
|
| 438 |
-
"import os\n",
|
| 439 |
-
"import cv2\n",
|
| 440 |
-
"import numpy as np\n",
|
| 441 |
-
"\n",
|
| 442 |
-
"\n",
|
| 443 |
-
"face_classes = {0: 'Chandler', 1: 'Joey', 2: 'Pheobe', 3: 'Rachel'}\n",
|
| 444 |
-
"\n",
|
| 445 |
-
"def draw_label(image, point, label, font=cv2.FONT_HERSHEY_SIMPLEX,\n",
|
| 446 |
-
" font_scale=0.8, thickness=1):\n",
|
| 447 |
-
" size = cv2.getTextSize(label, font, font_scale, thickness)[0]\n",
|
| 448 |
-
" x, y = point\n",
|
| 449 |
-
" cv2.rectangle(image, (x, y - size[1]), (x + size[0], y), (255, 0, 0), cv2.FILLED)\n",
|
| 450 |
-
" cv2.putText(image, label, point, font, font_scale, (255, 255, 255), thickness, lineType=cv2.LINE_AA)\n",
|
| 451 |
-
" \n",
|
| 452 |
-
"margin = 0.2\n",
|
| 453 |
-
"# load model and weights\n",
|
| 454 |
-
"img_size = 64\n",
|
| 455 |
-
"\n",
|
| 456 |
-
"detector = dlib.get_frontal_face_detector()\n",
|
| 457 |
-
"\n",
|
| 458 |
-
"cap = cv2.VideoCapture('testfriends.mp4')\n",
|
| 459 |
-
"\n",
|
| 460 |
-
"while True:\n",
|
| 461 |
-
" ret, frame = cap.read()\n",
|
| 462 |
-
" frame = cv2.resize(frame, None, fx=0.5, fy=0.5, interpolation = cv2.INTER_LINEAR)\n",
|
| 463 |
-
" preprocessed_faces = [] \n",
|
| 464 |
-
" \n",
|
| 465 |
-
" input_img = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)\n",
|
| 466 |
-
" img_h, img_w, _ = np.shape(input_img)\n",
|
| 467 |
-
" detected = detector(frame, 1)\n",
|
| 468 |
-
" faces = np.empty((len(detected), img_size, img_size, 3))\n",
|
| 469 |
-
" \n",
|
| 470 |
-
" preprocessed_faces_emo = []\n",
|
| 471 |
-
" if len(detected) > 0:\n",
|
| 472 |
-
" for i, d in enumerate(detected):\n",
|
| 473 |
-
" x1, y1, x2, y2, w, h = d.left(), d.top(), d.right() + 1, d.bottom() + 1, d.width(), d.height()\n",
|
| 474 |
-
" xw1 = max(int(x1 - margin * w), 0)\n",
|
| 475 |
-
" yw1 = max(int(y1 - margin * h), 0)\n",
|
| 476 |
-
" xw2 = min(int(x2 + margin * w), img_w - 1)\n",
|
| 477 |
-
" yw2 = min(int(y2 + margin * h), img_h - 1)\n",
|
| 478 |
-
" cv2.rectangle(frame, (x1, y1), (x2, y2), (255, 0, 0), 2)\n",
|
| 479 |
-
" # cv2.rectangle(img, (xw1, yw1), (xw2, yw2), (255, 0, 0), 2)\n",
|
| 480 |
-
" #faces[i, :, :, :] = cv2.resize(frame[yw1:yw2 + 1, xw1:xw2 + 1, :], (img_size, img_size))\n",
|
| 481 |
-
" face = frame[yw1:yw2 + 1, xw1:xw2 + 1, :]\n",
|
| 482 |
-
" face = cv2.resize(face, (48, 48), interpolation = cv2.INTER_AREA)\n",
|
| 483 |
-
" face = face.astype(\"float\") / 255.0\n",
|
| 484 |
-
" face = img_to_array(face)\n",
|
| 485 |
-
" face = np.expand_dims(face, axis=0)\n",
|
| 486 |
-
" preprocessed_faces.append(face)\n",
|
| 487 |
-
"\n",
|
| 488 |
-
" # make a prediction for Emotion \n",
|
| 489 |
-
" face_labels = []\n",
|
| 490 |
-
" for i, d in enumerate(detected):\n",
|
| 491 |
-
" preds = classifier.predict(preprocessed_faces[i])[0]\n",
|
| 492 |
-
" face_labels.append(face_classes[preds.argmax()])\n",
|
| 493 |
-
" \n",
|
| 494 |
-
" # draw results\n",
|
| 495 |
-
" for i, d in enumerate(detected):\n",
|
| 496 |
-
" label = \"{}\".format(face_labels[i])\n",
|
| 497 |
-
" draw_label(frame, (d.left(), d.top()), label)\n",
|
| 498 |
-
"\n",
|
| 499 |
-
" cv2.imshow(\"Friend Character Identifier\", frame)\n",
|
| 500 |
-
" if cv2.waitKey(1) == 13: #13 is the Enter Key\n",
|
| 501 |
-
" break\n",
|
| 502 |
-
"\n",
|
| 503 |
-
"cap.release()\n",
|
| 504 |
-
"cv2.destroyAllWindows() "
|
| 505 |
-
]
|
| 506 |
-
},
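To make the margin expansion in the loop above concrete, here is a worked instance with a hypothetical detection box:

# Hypothetical box: x1=100, y1=50, x2=180, y2=130, so w = h = 80, margin = 0.2
# xw1 = max(int(100 - 0.2*80), 0)         = 84
# yw1 = max(int(50  - 0.2*80), 0)         = 34
# xw2 = min(int(180 + 0.2*80), img_w - 1) = 196   (assuming the frame is wide enough)
# yw2 = min(int(130 + 0.2*80), img_h - 1) = 146
# i.e. the crop is widened by 20% of the box size on each side, clamped to the frame.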
|
| 507 |
-
{
|
| 508 |
-
"cell_type": "code",
|
| 509 |
-
"execution_count": null,
|
| 510 |
-
"metadata": {},
|
| 511 |
-
"outputs": [],
|
| 512 |
-
"source": []
|
| 513 |
-
}
|
| 514 |
-
],
|
| 515 |
-
"metadata": {
|
| 516 |
-
"kernelspec": {
|
| 517 |
-
"display_name": "Python 3",
|
| 518 |
-
"language": "python",
|
| 519 |
-
"name": "python3"
|
| 520 |
-
},
|
| 521 |
-
"language_info": {
|
| 522 |
-
"codemirror_mode": {
|
| 523 |
-
"name": "ipython",
|
| 524 |
-
"version": 3
|
| 525 |
-
},
|
| 526 |
-
"file_extension": ".py",
|
| 527 |
-
"mimetype": "text/x-python",
|
| 528 |
-
"name": "python",
|
| 529 |
-
"nbconvert_exporter": "python",
|
| 530 |
-
"pygments_lexer": "ipython3",
|
| 531 |
-
"version": "3.6.6"
|
| 532 |
-
}
|
| 533 |
-
},
|
| 534 |
-
"nbformat": 4,
|
| 535 |
-
"nbformat_minor": 2
|
| 536 |
-
}
|
|
|
|
| 1 |
+
version https://git-lfs.github.com/spec/v1
|
| 2 |
+
oid sha256:169cb30aeae1a32fe7c1ff8da5ecd1c922876310843bd7f89d66dc69880da545
|
| 3 |
+
size 23092
|
25. Face Recognition/.ipynb_checkpoints/25.2 Face Recogition - Matching Faces-checkpoint.ipynb
CHANGED
|
The diff for this file is too large to render.
See raw diff
|
|
|
25. Face Recognition/.ipynb_checkpoints/25.3 Face Recogition - One Shot Learning-checkpoint.ipynb
CHANGED
|
@@ -1,406 +1,3 @@
|
|
| 1 |
-
|
| 2 |
-
|
| 3 |
-
|
| 4 |
-
"cell_type": "markdown",
|
| 5 |
-
"metadata": {},
|
| 6 |
-
"source": [
|
| 7 |
-
"## 1. Extract faces from pictures of people \n",
|
| 8 |
-
"### Instrutions:\n",
|
| 9 |
-
"- Place photos of people (one face visible) in the folder called \"./people\"\n",
|
| 10 |
-
"- Replace my photo titled \"Rajeev.jpg\" with a piture of your face for testing on a webcam\n",
|
| 11 |
-
"- Faces are extracted using the haarcascade_frontalface_default detector model\n",
|
| 12 |
-
"- Extracted faces are placed in the folder called \"./group_of_faces\"\n",
|
| 13 |
-
"#### We are extracting the faces needed for our one-shot learning model, it will load 5 extracted faces"
|
| 14 |
-
]
|
| 15 |
-
},
|
| 16 |
-
{
|
| 17 |
-
"cell_type": "code",
|
| 18 |
-
"execution_count": 1,
|
| 19 |
-
"metadata": {},
|
| 20 |
-
"outputs": [
|
| 21 |
-
{
|
| 22 |
-
"name": "stdout",
|
| 23 |
-
"output_type": "stream",
|
| 24 |
-
"text": [
|
| 25 |
-
"Collected image names\n"
|
| 26 |
-
]
|
| 27 |
-
}
|
| 28 |
-
],
|
| 29 |
-
"source": [
|
| 30 |
-
"# The code below extracts faces from images and places them in the folder\n",
|
| 31 |
-
"from os import listdir\n",
|
| 32 |
-
"from os.path import isfile, join\n",
|
| 33 |
-
"import cv2\n",
|
| 34 |
-
"\n",
|
| 35 |
-
"# Loading out HAARCascade Face Detector \n",
|
| 36 |
-
"face_detector = cv2.CascadeClassifier('Haarcascades/haarcascade_frontalface_default.xml')\n",
|
| 37 |
-
"\n",
|
| 38 |
-
"# Directory of image of persons we'll be extracting faces frommy\n",
|
| 39 |
-
"mypath = \"./people/\"\n",
|
| 40 |
-
"image_file_names = [f for f in listdir(mypath) if isfile(join(mypath, f))]\n",
|
| 41 |
-
"print(\"Collected image names\")\n",
|
| 42 |
-
"\n",
|
| 43 |
-
"for image_name in image_file_names:\n",
|
| 44 |
-
" person_image = cv2.imread(mypath+image_name)\n",
|
| 45 |
-
" face_info = face_detector.detectMultiScale(person_image, 1.3, 5)\n",
|
| 46 |
-
" for (x,y,w,h) in face_info:\n",
|
| 47 |
-
" face = person_image[y:y+h, x:x+w]\n",
|
| 48 |
-
" roi = cv2.resize(face, (128, 128), interpolation = cv2.INTER_CUBIC)\n",
|
| 49 |
-
" path = \"./group_of_faces/\" + \"face_\" + image_name \n",
|
| 50 |
-
" cv2.imwrite(path, roi)\n",
|
| 51 |
-
" cv2.imshow(\"face\", roi)\n",
|
| 52 |
-
" \n",
|
| 53 |
-
" cv2.waitKey(0)\n",
|
| 54 |
-
"cv2.destroyAllWindows()"
|
| 55 |
-
]
|
| 56 |
-
},
|
| 57 |
-
{
|
| 58 |
-
"cell_type": "markdown",
|
| 59 |
-
"metadata": {},
|
| 60 |
-
"source": [
|
| 61 |
-
"## 2. Load our VGGFaceModel \n",
|
| 62 |
-
"- This block of code defines the VGGFace model (which we use later) and loads the model"
|
| 63 |
-
]
|
| 64 |
-
},
|
| 65 |
-
{
|
| 66 |
-
"cell_type": "code",
|
| 67 |
-
"execution_count": 3,
|
| 68 |
-
"metadata": {},
|
| 69 |
-
"outputs": [
|
| 70 |
-
{
|
| 71 |
-
"name": "stdout",
|
| 72 |
-
"output_type": "stream",
|
| 73 |
-
"text": [
|
| 74 |
-
"Model Loaded\n"
|
| 75 |
-
]
|
| 76 |
-
}
|
| 77 |
-
],
|
| 78 |
-
"source": [
|
| 79 |
-
"#author Sefik Ilkin Serengil\n",
|
| 80 |
-
"#you can find the documentation of this code from the following link: https://sefiks.com/2018/08/06/deep-face-recognition-with-keras/\n",
|
| 81 |
-
"\n",
|
| 82 |
-
"import numpy as np\n",
|
| 83 |
-
"import cv2\n",
|
| 84 |
-
"\n",
|
| 85 |
-
"from tensorflow.keras.models import Model, Sequential\n",
|
| 86 |
-
"from tensorflow.keras.layers import Input, Convolution2D, ZeroPadding2D, MaxPooling2D, Flatten, Dense, Dropout, Activation\n",
|
| 87 |
-
"from PIL import Image\n",
|
| 88 |
-
"from tensorflow.keras.preprocessing.image import load_img, save_img, img_to_array\n",
|
| 89 |
-
"from tensorflow.keras.applications.imagenet_utils import preprocess_input\n",
|
| 90 |
-
"from tensorflow.keras.preprocessing import image\n",
|
| 91 |
-
"import matplotlib.pyplot as plt\n",
|
| 92 |
-
"from os import listdir\n",
|
| 93 |
-
"\n",
|
| 94 |
-
"def preprocess_image(image_path):\n",
|
| 95 |
-
" \"\"\"Loads image from path and resizes it\"\"\"\n",
|
| 96 |
-
" img = load_img(image_path, target_size=(224, 224))\n",
|
| 97 |
-
" img = img_to_array(img)\n",
|
| 98 |
-
" img = np.expand_dims(img, axis=0)\n",
|
| 99 |
-
" img = preprocess_input(img)\n",
|
| 100 |
-
" return img\n",
|
| 101 |
-
"\n",
|
| 102 |
-
"model = Sequential()\n",
|
| 103 |
-
"model.add(ZeroPadding2D((1,1),input_shape=(224,224, 3)))\n",
|
| 104 |
-
"model.add(Convolution2D(64, (3, 3), activation='relu'))\n",
|
| 105 |
-
"model.add(ZeroPadding2D((1,1)))\n",
|
| 106 |
-
"model.add(Convolution2D(64, (3, 3), activation='relu'))\n",
|
| 107 |
-
"model.add(MaxPooling2D((2,2), strides=(2,2)))\n",
|
| 108 |
-
"\n",
|
| 109 |
-
"model.add(ZeroPadding2D((1,1)))\n",
|
| 110 |
-
"model.add(Convolution2D(128, (3, 3), activation='relu'))\n",
|
| 111 |
-
"model.add(ZeroPadding2D((1,1)))\n",
|
| 112 |
-
"model.add(Convolution2D(128, (3, 3), activation='relu'))\n",
|
| 113 |
-
"model.add(MaxPooling2D((2,2), strides=(2,2)))\n",
|
| 114 |
-
"\n",
|
| 115 |
-
"model.add(ZeroPadding2D((1,1)))\n",
|
| 116 |
-
"model.add(Convolution2D(256, (3, 3), activation='relu'))\n",
|
| 117 |
-
"model.add(ZeroPadding2D((1,1)))\n",
|
| 118 |
-
"model.add(Convolution2D(256, (3, 3), activation='relu'))\n",
|
| 119 |
-
"model.add(ZeroPadding2D((1,1)))\n",
|
| 120 |
-
"model.add(Convolution2D(256, (3, 3), activation='relu'))\n",
|
| 121 |
-
"model.add(MaxPooling2D((2,2), strides=(2,2)))\n",
|
| 122 |
-
"\n",
|
| 123 |
-
"model.add(ZeroPadding2D((1,1)))\n",
|
| 124 |
-
"model.add(Convolution2D(512, (3, 3), activation='relu'))\n",
|
| 125 |
-
"model.add(ZeroPadding2D((1,1)))\n",
|
| 126 |
-
"model.add(Convolution2D(512, (3, 3), activation='relu'))\n",
|
| 127 |
-
"model.add(ZeroPadding2D((1,1)))\n",
|
| 128 |
-
"model.add(Convolution2D(512, (3, 3), activation='relu'))\n",
|
| 129 |
-
"model.add(MaxPooling2D((2,2), strides=(2,2)))\n",
|
| 130 |
-
"\n",
|
| 131 |
-
"model.add(ZeroPadding2D((1,1)))\n",
|
| 132 |
-
"model.add(Convolution2D(512, (3, 3), activation='relu'))\n",
|
| 133 |
-
"model.add(ZeroPadding2D((1,1)))\n",
|
| 134 |
-
"model.add(Convolution2D(512, (3, 3), activation='relu'))\n",
|
| 135 |
-
"model.add(ZeroPadding2D((1,1)))\n",
|
| 136 |
-
"model.add(Convolution2D(512, (3, 3), activation='relu'))\n",
|
| 137 |
-
"model.add(MaxPooling2D((2,2), strides=(2,2)))\n",
|
| 138 |
-
"\n",
|
| 139 |
-
"model.add(Convolution2D(4096, (7, 7), activation='relu'))\n",
|
| 140 |
-
"model.add(Dropout(0.5))\n",
|
| 141 |
-
"model.add(Convolution2D(4096, (1, 1), activation='relu'))\n",
|
| 142 |
-
"model.add(Dropout(0.5))\n",
|
| 143 |
-
"model.add(Convolution2D(2622, (1, 1)))\n",
|
| 144 |
-
"model.add(Flatten())\n",
|
| 145 |
-
"model.add(Activation('softmax'))\n",
|
| 146 |
-
"\n",
|
| 147 |
-
"#you can download pretrained weights from https://drive.google.com/file/d/1CPSeum3HpopfomUEK1gybeuIVoeJT_Eo/view?usp=sharing\n",
|
| 148 |
-
"from tensorflow.keras.models import model_from_json\n",
|
| 149 |
-
"model.load_weights('vgg_face_weights.h5')\n",
|
| 150 |
-
"\n",
|
| 151 |
-
"vgg_face_descriptor = Model(inputs=model.layers[0].input, outputs=model.layers[-2].output)\n",
|
| 152 |
-
"\n",
|
| 153 |
-
"model = vgg_face_descriptor\n",
|
| 154 |
-
"\n",
|
| 155 |
-
" \n",
|
| 156 |
-
"print(\"Model Loaded\")"
|
| 157 |
-
]
|
| 158 |
-
},
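A quick way to confirm the descriptor model is wired up correctly is to embed a single face. A sketch, assuming one of the extracted crops, e.g. 'face_Rajeev.jpg', exists in ./group_of_faces (a hypothetical filename):

embedding = model.predict(preprocess_image('./group_of_faces/face_Rajeev.jpg'))[0, :]
print(embedding.shape)    # (2622,) -- taken from the layer just before the softmax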
|
| 159 |
-
{
|
| 160 |
-
"cell_type": "markdown",
|
| 161 |
-
"metadata": {},
|
| 162 |
-
"source": [
|
| 163 |
-
"## 3. Test model using your Webcam\n",
|
| 164 |
-
"This code looks up the faces you extracted in the \"group_of_faces\" folder and uses the similarity (Cosine Similarity) to detect which faces is most similar to the one being extracted with your webcam."
|
| 165 |
-
]
|
| 166 |
-
},
|
| 167 |
-
{
|
| 168 |
-
"cell_type": "code",
|
| 169 |
-
"execution_count": 4,
|
| 170 |
-
"metadata": {},
|
| 171 |
-
"outputs": [
|
| 172 |
-
{
|
| 173 |
-
"name": "stdout",
|
| 174 |
-
"output_type": "stream",
|
| 175 |
-
"text": [
|
| 176 |
-
"Face representations retrieved successfully\n"
|
| 177 |
-
]
|
| 178 |
-
}
|
| 179 |
-
],
|
| 180 |
-
"source": [
|
| 181 |
-
"#points to your extracted faces\n",
|
| 182 |
-
"people_pictures = \"./group_of_faces/\"\n",
|
| 183 |
-
"\n",
|
| 184 |
-
"all_people_faces = dict()\n",
|
| 185 |
-
"\n",
|
| 186 |
-
"for file in listdir(people_pictures):\n",
|
| 187 |
-
" person_face, extension = file.split(\".\")\n",
|
| 188 |
-
" all_people_faces[person_face] = model.predict(preprocess_image('./group_of_faces/%s.jpg' % (person_face)))[0,:]\n",
|
| 189 |
-
"\n",
|
| 190 |
-
"print(\"Face representations retrieved successfully\")\n",
|
| 191 |
-
"\n",
|
| 192 |
-
"def findCosineSimilarity(source_representation, test_representation):\n",
|
| 193 |
-
" a = np.matmul(np.transpose(source_representation), test_representation)\n",
|
| 194 |
-
" b = np.sum(np.multiply(source_representation, source_representation))\n",
|
| 195 |
-
" c = np.sum(np.multiply(test_representation, test_representation))\n",
|
| 196 |
-
" return 1 - (a / (np.sqrt(b) * np.sqrt(c)))\n",
|
| 197 |
-
"\n",
|
| 198 |
-
"#Open Webcam\n",
|
| 199 |
-
"cap = cv2.VideoCapture(0) \n",
|
| 200 |
-
"\n",
|
| 201 |
-
"while(True):\n",
|
| 202 |
-
" ret, img = cap.read()\n",
|
| 203 |
-
" faces = face_detector.detectMultiScale(img, 1.3, 5)\n",
|
| 204 |
-
"\n",
|
| 205 |
-
" for (x,y,w,h) in faces:\n",
|
| 206 |
-
" if w > 100: #Adjust accordingly if your webcam resoluation is higher\n",
|
| 207 |
-
" cv2.rectangle(img,(x,y),(x+w,y+h),(255,0,0),2) #draw rectangle to main image\n",
|
| 208 |
-
" detected_face = img[int(y):int(y+h), int(x):int(x+w)] #crop detected face\n",
|
| 209 |
-
" detected_face = cv2.resize(detected_face, (224, 224)) #resize to 224x224\n",
|
| 210 |
-
"\n",
|
| 211 |
-
" img_pixels = image.img_to_array(detected_face)\n",
|
| 212 |
-
" img_pixels = np.expand_dims(img_pixels, axis = 0)\n",
|
| 213 |
-
" img_pixels /= 255\n",
|
| 214 |
-
"\n",
|
| 215 |
-
" captured_representation = model.predict(img_pixels)[0,:]\n",
|
| 216 |
-
"\n",
|
| 217 |
-
" found = 0\n",
|
| 218 |
-
" for i in all_people_faces:\n",
|
| 219 |
-
" person_name = i\n",
|
| 220 |
-
" representation = all_people_faces[i]\n",
|
| 221 |
-
"\n",
|
| 222 |
-
" similarity = findCosineSimilarity(representation, captured_representation)\n",
|
| 223 |
-
" if(similarity < 0.30):\n",
|
| 224 |
-
" cv2.putText(img, person_name[5:], (int(x+w+15), int(y-12)), cv2.FONT_HERSHEY_SIMPLEX, 1, (0, 0, 255), 2)\n",
|
| 225 |
-
" found = 1\n",
|
| 226 |
-
" break\n",
|
| 227 |
-
"\n",
|
| 228 |
-
" #connect face and text\n",
|
| 229 |
-
" cv2.line(img,(int((x+x+w)/2),y+15),(x+w,y-20),(255, 0, 0),1)\n",
|
| 230 |
-
" cv2.line(img,(x+w,y-20),(x+w+10,y-20),(255, 0, 0),1)\n",
|
| 231 |
-
"\n",
|
| 232 |
-
" if(found == 0): #if found image is not in our people database\n",
|
| 233 |
-
" cv2.putText(img, 'unknown', (int(x+w+15), int(y-12)), cv2.FONT_HERSHEY_SIMPLEX, 1, (255, 0, 0), 2)\n",
|
| 234 |
-
"\n",
|
| 235 |
-
" cv2.imshow('img',img)\n",
|
| 236 |
-
"\n",
|
| 237 |
-
" if cv2.waitKey(1) == 13: #13 is the Enter Key\n",
|
| 238 |
-
" break\n",
|
| 239 |
-
" \n",
|
| 240 |
-
"cap.release()\n",
|
| 241 |
-
"cv2.destroyAllWindows()"
|
| 242 |
-
]
|
| 243 |
-
},
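Note that findCosineSimilarity actually returns the cosine distance, 1 - cos(a, b), so smaller means more similar. A tiny worked case with hypothetical 2-D vectors shows how the 0.30 threshold behaves:

import numpy as np

a = np.array([1.0, 0.0])    # hypothetical stored representation
b = np.array([1.0, 1.0])    # hypothetical captured representation
# 1 - (a . b) / (|a| |b|) = 1 - 1/sqrt(2) ~= 0.293 < 0.30, so this pair counts as a match
print(findCosineSimilarity(a, b))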
|
| 244 |
-
{
|
| 245 |
-
"cell_type": "markdown",
|
| 246 |
-
"metadata": {},
|
| 247 |
-
"source": [
|
| 248 |
-
"## Test on a video\n",
|
| 249 |
-
"### Since we're using the Friends TV Series characters, let's extract the faces from the images I placed in the \"./friends\" folder"
|
| 250 |
-
]
|
| 251 |
-
},
|
| 252 |
-
{
|
| 253 |
-
"cell_type": "code",
|
| 254 |
-
"execution_count": 5,
|
| 255 |
-
"metadata": {},
|
| 256 |
-
"outputs": [
|
| 257 |
-
{
|
| 258 |
-
"name": "stdout",
|
| 259 |
-
"output_type": "stream",
|
| 260 |
-
"text": [
|
| 261 |
-
"Collected image names\n"
|
| 262 |
-
]
|
| 263 |
-
}
|
| 264 |
-
],
|
| 265 |
-
"source": [
|
| 266 |
-
"from os import listdir\n",
|
| 267 |
-
"from os.path import isfile, join\n",
|
| 268 |
-
"import cv2\n",
|
| 269 |
-
"\n",
|
| 270 |
-
"# Loading out HAARCascade Face Detector \n",
|
| 271 |
-
"face_detector = cv2.CascadeClassifier('Haarcascades/haarcascade_frontalface_default.xml')\n",
|
| 272 |
-
"\n",
|
| 273 |
-
"# Directory of image of persons we'll be extracting faces frommy\n",
|
| 274 |
-
"mypath = \"./friends/\"\n",
|
| 275 |
-
"image_file_names = [f for f in listdir(mypath) if isfile(join(mypath, f))]\n",
|
| 276 |
-
"print(\"Collected image names\")\n",
|
| 277 |
-
"\n",
|
| 278 |
-
"for image_name in image_file_names:\n",
|
| 279 |
-
" person_image = cv2.imread(mypath+image_name)\n",
|
| 280 |
-
" face_info = face_detector.detectMultiScale(person_image, 1.3, 5)\n",
|
| 281 |
-
" for (x,y,w,h) in face_info:\n",
|
| 282 |
-
" face = person_image[y:y+h, x:x+w]\n",
|
| 283 |
-
" roi = cv2.resize(face, (128, 128), interpolation = cv2.INTER_CUBIC)\n",
|
| 284 |
-
" path = \"./friends_faces/\" + \"face_\" + image_name \n",
|
| 285 |
-
" cv2.imwrite(path, roi)\n",
|
| 286 |
-
" cv2.imshow(\"face\", roi)\n",
|
| 287 |
-
" \n",
|
| 288 |
-
" cv2.waitKey(0)\n",
|
| 289 |
-
"cv2.destroyAllWindows()"
|
| 290 |
-
]
|
| 291 |
-
},
|
| 292 |
-
{
|
| 293 |
-
"cell_type": "markdown",
|
| 294 |
-
"metadata": {},
|
| 295 |
-
"source": [
|
| 296 |
-
"### Again, we load our faces from the \"friends_faces\" directory and we run our face classifier model our test video"
|
| 297 |
-
]
|
| 298 |
-
},
|
| 299 |
-
{
|
| 300 |
-
"cell_type": "code",
|
| 301 |
-
"execution_count": 10,
|
| 302 |
-
"metadata": {},
|
| 303 |
-
"outputs": [
|
| 304 |
-
{
|
| 305 |
-
"name": "stdout",
|
| 306 |
-
"output_type": "stream",
|
| 307 |
-
"text": [
|
| 308 |
-
"Face representations retrieved successfully\n"
|
| 309 |
-
]
|
| 310 |
-
}
|
| 311 |
-
],
|
| 312 |
-
"source": [
|
| 313 |
-
"#points to your extracted faces\n",
|
| 314 |
-
"people_pictures = \"./friends_faces/\"\n",
|
| 315 |
-
"\n",
|
| 316 |
-
"all_people_faces = dict()\n",
|
| 317 |
-
"\n",
|
| 318 |
-
"for file in listdir(people_pictures):\n",
|
| 319 |
-
" person_face, extension = file.split(\".\")\n",
|
| 320 |
-
" all_people_faces[person_face] = model.predict(preprocess_image('./friends_faces/%s.jpg' % (person_face)))[0,:]\n",
|
| 321 |
-
"\n",
|
| 322 |
-
"print(\"Face representations retrieved successfully\")\n",
|
| 323 |
-
"\n",
|
| 324 |
-
"def findCosineSimilarity(source_representation, test_representation):\n",
|
| 325 |
-
" a = np.matmul(np.transpose(source_representation), test_representation)\n",
|
| 326 |
-
" b = np.sum(np.multiply(source_representation, source_representation))\n",
|
| 327 |
-
" c = np.sum(np.multiply(test_representation, test_representation))\n",
|
| 328 |
-
" return 1 - (a / (np.sqrt(b) * np.sqrt(c)))\n",
|
| 329 |
-
"\n",
|
| 330 |
-
"cap = cv2.VideoCapture('testfriends.mp4')\n",
|
| 331 |
-
"\n",
|
| 332 |
-
"while(True):\n",
|
| 333 |
-
" ret, img = cap.read()\n",
|
| 334 |
-
" img = cv2.resize(img, (320, 180)) # Re-size video to as smaller size to improve face detection speed\n",
|
| 335 |
-
" faces = face_detector.detectMultiScale(img, 1.3, 5)\n",
|
| 336 |
-
"\n",
|
| 337 |
-
" for (x,y,w,h) in faces:\n",
|
| 338 |
-
" if w > 13: \n",
|
| 339 |
-
" cv2.rectangle(img,(x,y),(x+w,y+h),(255,0,0),2) #draw rectangle to main image\n",
|
| 340 |
-
"\n",
|
| 341 |
-
" detected_face = img[int(y):int(y+h), int(x):int(x+w)] #crop detected face\n",
|
| 342 |
-
" detected_face = cv2.resize(detected_face, (224, 224)) #resize to 224x224\n",
|
| 343 |
-
"\n",
|
| 344 |
-
" img_pixels = image.img_to_array(detected_face)\n",
|
| 345 |
-
" img_pixels = np.expand_dims(img_pixels, axis = 0)\n",
|
| 346 |
-
" img_pixels /= 255\n",
|
| 347 |
-
"\n",
|
| 348 |
-
" captured_representation = model.predict(img_pixels)[0,:]\n",
|
| 349 |
-
"\n",
|
| 350 |
-
" found = 0\n",
|
| 351 |
-
" for i in all_people_faces:\n",
|
| 352 |
-
" person_name = i\n",
|
| 353 |
-
" representation = all_people_faces[i]\n",
|
| 354 |
-
"\n",
|
| 355 |
-
" similarity = findCosineSimilarity(representation, captured_representation)\n",
|
| 356 |
-
" if(similarity < 0.30):\n",
|
| 357 |
-
" cv2.putText(img, person_name[5:], (int(x+w+15), int(y-12)), cv2.FONT_HERSHEY_SIMPLEX, 1, (0, 0, 255), 2)\n",
|
| 358 |
-
" found = 1\n",
|
| 359 |
-
" break\n",
|
| 360 |
-
"\n",
|
| 361 |
-
" #connect face and text\n",
|
| 362 |
-
" cv2.line(img,(int((x+x+w)/2),y+15),(x+w,y-20),(255, 0, 0),1)\n",
|
| 363 |
-
" cv2.line(img,(x+w,y-20),(x+w+10,y-20),(255, 0, 0),1)\n",
|
| 364 |
-
"\n",
|
| 365 |
-
" if(found == 0): #if found image is not in our people database\n",
|
| 366 |
-
" cv2.putText(img, 'unknown', (int(x+w+15), int(y-12)), cv2.FONT_HERSHEY_SIMPLEX, 1, (255, 0, 0), 2)\n",
|
| 367 |
-
"\n",
|
| 368 |
-
" cv2.imshow('img',img)\n",
|
| 369 |
-
" if cv2.waitKey(1) == 13: #13 is the Enter Key\n",
|
| 370 |
-
" break\n",
|
| 371 |
-
"\n",
|
| 372 |
-
"#kill open cv things\n",
|
| 373 |
-
"cap.release()\n",
|
| 374 |
-
"cv2.destroyAllWindows()"
|
| 375 |
-
]
|
| 376 |
-
},
|
| 377 |
-
{
|
| 378 |
-
"cell_type": "code",
|
| 379 |
-
"execution_count": null,
|
| 380 |
-
"metadata": {},
|
| 381 |
-
"outputs": [],
|
| 382 |
-
"source": []
|
| 383 |
-
}
|
| 384 |
-
],
|
| 385 |
-
"metadata": {
|
| 386 |
-
"kernelspec": {
|
| 387 |
-
"display_name": "Python 3",
|
| 388 |
-
"language": "python",
|
| 389 |
-
"name": "python3"
|
| 390 |
-
},
|
| 391 |
-
"language_info": {
|
| 392 |
-
"codemirror_mode": {
|
| 393 |
-
"name": "ipython",
|
| 394 |
-
"version": 3
|
| 395 |
-
},
|
| 396 |
-
"file_extension": ".py",
|
| 397 |
-
"mimetype": "text/x-python",
|
| 398 |
-
"name": "python",
|
| 399 |
-
"nbconvert_exporter": "python",
|
| 400 |
-
"pygments_lexer": "ipython3",
|
| 401 |
-
"version": "3.7.4"
|
| 402 |
-
}
|
| 403 |
-
},
|
| 404 |
-
"nbformat": 4,
|
| 405 |
-
"nbformat_minor": 2
|
| 406 |
-
}
|
|
|
|
| 1 |
+
version https://git-lfs.github.com/spec/v1
|
| 2 |
+
oid sha256:3cd80b0b58742fc9c3563f3d11cf3e608b8b533ab498516b25072b291751f420
|
| 3 |
+
size 15335
|
25. Face Recognition/.ipynb_checkpoints/Face Recogition - Matching Faces-checkpoint.ipynb
CHANGED
|
@@ -1,6 +1,3 @@
|
|
| 1 |
-
|
| 2 |
-
|
| 3 |
-
|
| 4 |
-
"nbformat": 4,
|
| 5 |
-
"nbformat_minor": 2
|
| 6 |
-
}
|
|
|
|
| 1 |
+
version https://git-lfs.github.com/spec/v1
|
| 2 |
+
oid sha256:188143f20ba64ead32853235735c589180056b6c7c47541744479767de696e37
|
| 3 |
+
size 72
|
|
|
|
|
|
|
|
|
25. Face Recognition/.ipynb_checkpoints/Face Recogition - One Shot Learning-checkpoint.ipynb
CHANGED
|
The diff for this file is too large to render.
See raw diff
|
|
|
25. Face Recognition/25.0 Face Extraction from Video - Build Dataset.ipynb
CHANGED
|
@@ -1,93 +1,3 @@
|
|
| 1 |
-
|
| 2 |
-
|
| 3 |
-
|
| 4 |
-
"cell_type": "markdown",
|
| 5 |
-
"metadata": {},
|
| 6 |
-
"source": [
|
| 7 |
-
"### Extracting the faces from a video"
|
| 8 |
-
]
|
| 9 |
-
},
|
| 10 |
-
{
|
| 11 |
-
"cell_type": "code",
|
| 12 |
-
"execution_count": null,
|
| 13 |
-
"metadata": {},
|
| 14 |
-
"outputs": [],
|
| 15 |
-
"source": [
|
| 16 |
-
"from os import listdir\n",
|
| 17 |
-
"from os.path import isfile, join\n",
|
| 18 |
-
"import os\n",
|
| 19 |
-
"import cv2\n",
|
| 20 |
-
"import dlib\n",
|
| 21 |
-
"import numpy as np\n",
|
| 22 |
-
"\n",
|
| 23 |
-
"# Define Image Path Here\n",
|
| 24 |
-
"image_path = \"./images/\"\n",
|
| 25 |
-
"\n",
|
| 26 |
-
"def draw_label(image, point, label, font=cv2.FONT_HERSHEY_SIMPLEX,\n",
|
| 27 |
-
" font_scale=0.8, thickness=1):\n",
|
| 28 |
-
" size = cv2.getTextSize(label, font, font_scale, thickness)[0]\n",
|
| 29 |
-
" x, y = point\n",
|
| 30 |
-
" cv2.rectangle(image, (x, y - size[1]), (x + size[0], y), (255, 0, 0), cv2.FILLED)\n",
|
| 31 |
-
" cv2.putText(image, label, point, font, font_scale, (255, 255, 255), thickness, lineType=cv2.LINE_AA)\n",
|
| 32 |
-
" \n",
|
| 33 |
-
"detector = dlib.get_frontal_face_detector()\n",
|
| 34 |
-
"\n",
|
| 35 |
-
"# Initialize Webcam\n",
|
| 36 |
-
"cap = cv2.VideoCapture('testfriends.mp4')\n",
|
| 37 |
-
"img_size = 64\n",
|
| 38 |
-
"margin = 0.2\n",
|
| 39 |
-
"frame_count = 0\n",
|
| 40 |
-
"\n",
|
| 41 |
-
"while True:\n",
|
| 42 |
-
" ret, frame = cap.read()\n",
|
| 43 |
-
" frame_count += 1\n",
|
| 44 |
-
" print(frame_count) \n",
|
| 45 |
-
" \n",
|
| 46 |
-
" input_img = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)\n",
|
| 47 |
-
" img_h, img_w, _ = np.shape(input_img)\n",
|
| 48 |
-
" detected = detector(frame, 1)\n",
|
| 49 |
-
" faces = []\n",
|
| 50 |
-
" \n",
|
| 51 |
-
" if len(detected) > 0:\n",
|
| 52 |
-
" for i, d in enumerate(detected):\n",
|
| 53 |
-
" x1, y1, x2, y2, w, h = d.left(), d.top(), d.right() + 1, d.bottom() + 1, d.width(), d.height()\n",
|
| 54 |
-
" xw1 = max(int(x1 - margin * w), 0)\n",
|
| 55 |
-
" yw1 = max(int(y1 - margin * h), 0)\n",
|
| 56 |
-
" xw2 = min(int(x2 + margin * w), img_w - 1)\n",
|
| 57 |
-
" yw2 = min(int(y2 + margin * h), img_h - 1)\n",
|
| 58 |
-
" face = frame[yw1:yw2 + 1, xw1:xw2 + 1, :]\n",
|
| 59 |
-
" file_name = \"./faces/\"+str(frame_count)+\"_\"+str(i)+\".jpg\"\n",
|
| 60 |
-
" cv2.imwrite(file_name, face)\n",
|
| 61 |
-
" cv2.rectangle(frame, (x1, y1), (x2, y2), (255, 0, 0), 2)\n",
|
| 62 |
-
"\n",
|
| 63 |
-
" cv2.imshow(\"Face Detector\", frame)\n",
|
| 64 |
-
" if cv2.waitKey(1) == 13: #13 is the Enter Key\n",
|
| 65 |
-
" break\n",
|
| 66 |
-
"\n",
|
| 67 |
-
"cap.release()\n",
|
| 68 |
-
"cv2.destroyAllWindows() "
|
| 69 |
-
]
|
| 70 |
-
}
|
| 71 |
-
],
|
| 72 |
-
"metadata": {
|
| 73 |
-
"kernelspec": {
|
| 74 |
-
"display_name": "Python 3",
|
| 75 |
-
"language": "python",
|
| 76 |
-
"name": "python3"
|
| 77 |
-
},
|
| 78 |
-
"language_info": {
|
| 79 |
-
"codemirror_mode": {
|
| 80 |
-
"name": "ipython",
|
| 81 |
-
"version": 3
|
| 82 |
-
},
|
| 83 |
-
"file_extension": ".py",
|
| 84 |
-
"mimetype": "text/x-python",
|
| 85 |
-
"name": "python",
|
| 86 |
-
"nbconvert_exporter": "python",
|
| 87 |
-
"pygments_lexer": "ipython3",
|
| 88 |
-
"version": "3.7.4"
|
| 89 |
-
}
|
| 90 |
-
},
|
| 91 |
-
"nbformat": 4,
|
| 92 |
-
"nbformat_minor": 2
|
| 93 |
-
}
|
|
|
|
| 1 |
+
version https://git-lfs.github.com/spec/v1
|
| 2 |
+
oid sha256:2b36b2231a5feaa366136c6aa0dc9026c9ec3b2b83bce6e52eaa64a462fc9786
|
| 3 |
+
size 2867
25. Face Recognition/25.1 Face Recognition - Friends Characters - Train and Test.ipynb
CHANGED
|
@@ -1,521 +1,3 @@
|
|
| 1 |
-
|
| 2 |
-
|
| 3 |
-
|
| 4 |
-
"cell_type": "markdown",
|
| 5 |
-
"metadata": {},
|
| 6 |
-
"source": [
|
| 7 |
-
"# Basic Deep Learning Face Recogntion\n",
|
| 8 |
-
"## Building a Friends TV Show Character Identifier"
|
| 9 |
-
]
|
| 10 |
-
},
|
| 11 |
-
{
|
| 12 |
-
"cell_type": "markdown",
|
| 13 |
-
"metadata": {},
|
| 14 |
-
"source": [
|
| 15 |
-
"## The learning objective of this lesson (25.1) is the create a 'dumb' face classifer using our LittleVGG model. We are simply training it with 100s of pictures of each Friends Character, and testing our model using a Test Video. \n",
|
| 16 |
-
"\n",
|
| 17 |
-
"## You will see how this is an in-effective way to do Face Recognition, why?\n",
|
| 18 |
-
"Because a traditional CNN looks at small subsections of a face to identify it. However, the angle and deformation of a face can easily throw off our model, making it misclassify and match faces that aren't correct - this happens a lot, especially when our training data looks different from our test data. Imagine that in the training data one person's face was mostly tilted downward. Our CNN would then be more likely to identify any downward-looking face in the test data as that person."
|
| 19 |
-
]
|
| 20 |
-
},
|
| 21 |
-
{
|
| 22 |
-
"cell_type": "markdown",
|
| 23 |
-
"metadata": {},
|
| 24 |
-
"source": [
|
| 25 |
-
"### Let's train our model\n",
|
| 26 |
-
"I've created a dataset with the faces of 4 Friends characters taken from a handful of different scenes."
|
| 27 |
-
]
|
| 28 |
-
},
|
| 29 |
-
{
|
| 30 |
-
"cell_type": "code",
|
| 31 |
-
"execution_count": 1,
|
| 32 |
-
"metadata": {},
|
| 33 |
-
"outputs": [
|
| 34 |
-
{
|
| 35 |
-
"name": "stdout",
|
| 36 |
-
"output_type": "stream",
|
| 37 |
-
"text": [
|
| 38 |
-
"Found 2663 images belonging to 4 classes.\n",
|
| 39 |
-
"Found 955 images belonging to 4 classes.\n"
|
| 40 |
-
]
|
| 41 |
-
}
|
| 42 |
-
],
|
| 43 |
-
"source": [
|
| 44 |
-
"from __future__ import print_function\n",
|
| 45 |
-
"import tensorflow as tf \n",
|
| 46 |
-
"from tensorflow.keras.preprocessing.image import ImageDataGenerator\n",
|
| 47 |
-
"from tensorflow.keras.models import Sequential\n",
|
| 48 |
-
"from tensorflow.keras.layers import Dense, Dropout, Activation, Flatten, BatchNormalization\n",
|
| 49 |
-
"from tensorflow.keras.layers import Conv2D, MaxPooling2D\n",
|
| 50 |
-
"from tensorflow.keras.preprocessing.image import ImageDataGenerator\n",
|
| 51 |
-
"import os\n",
|
| 52 |
-
"\n",
|
| 53 |
-
"num_classes = 4\n",
|
| 54 |
-
"img_rows, img_cols = 48, 48\n",
|
| 55 |
-
"batch_size = 16\n",
|
| 56 |
-
"\n",
|
| 57 |
-
"train_data_dir = './faces/train'\n",
|
| 58 |
-
"validation_data_dir = './faces/validation'\n",
|
| 59 |
-
"\n",
|
| 60 |
-
"# Let's use some data augmentaiton \n",
|
| 61 |
-
"train_datagen = ImageDataGenerator(\n",
|
| 62 |
-
" rescale=1./255,\n",
|
| 63 |
-
" rotation_range=30,\n",
|
| 64 |
-
" shear_range=0.3,\n",
|
| 65 |
-
" zoom_range=0.3,\n",
|
| 66 |
-
" width_shift_range=0.4,\n",
|
| 67 |
-
" height_shift_range=0.4,\n",
|
| 68 |
-
" horizontal_flip=True,\n",
|
| 69 |
-
" fill_mode='nearest')\n",
|
| 70 |
-
" \n",
|
| 71 |
-
"validation_datagen = ImageDataGenerator(rescale=1./255)\n",
|
| 72 |
-
" \n",
|
| 73 |
-
"train_generator = train_datagen.flow_from_directory(\n",
|
| 74 |
-
" train_data_dir,\n",
|
| 75 |
-
" target_size=(img_rows, img_cols),\n",
|
| 76 |
-
" batch_size=batch_size,\n",
|
| 77 |
-
" class_mode='categorical',\n",
|
| 78 |
-
" shuffle=True)\n",
|
| 79 |
-
" \n",
|
| 80 |
-
"validation_generator = validation_datagen.flow_from_directory(\n",
|
| 81 |
-
" validation_data_dir,\n",
|
| 82 |
-
" target_size=(img_rows, img_cols),\n",
|
| 83 |
-
" batch_size=batch_size,\n",
|
| 84 |
-
" class_mode='categorical',\n",
|
| 85 |
-
" shuffle=True)"
|
| 86 |
-
]
|
| 87 |
-
},
|
| 88 |
-
{
|
| 89 |
-
"cell_type": "code",
|
| 90 |
-
"execution_count": 3,
|
| 91 |
-
"metadata": {},
|
| 92 |
-
"outputs": [],
|
| 93 |
-
"source": [
|
| 94 |
-
"#Our Keras imports\n",
|
| 95 |
-
"from tensorflow.keras.models import Sequential\n",
|
| 96 |
-
"from tensorflow.keras.layers import BatchNormalization\n",
|
| 97 |
-
"from tensorflow.keras.layers import Conv2D, MaxPooling2D\n",
|
| 98 |
-
"from tensorflow.keras.layers import ELU\n",
|
| 99 |
-
"from tensorflow.keras.layers import Activation, Flatten, Dropout, Dense"
|
| 100 |
-
]
|
| 101 |
-
},
|
| 102 |
-
{
|
| 103 |
-
"cell_type": "markdown",
|
| 104 |
-
"metadata": {},
|
| 105 |
-
"source": [
|
| 106 |
-
"### Creating a simple VGG based model for Face Recognition"
|
| 107 |
-
]
|
| 108 |
-
},
|
| 109 |
-
```python
model = Sequential()

# Block #1: first CONV => ELU => CONV => ELU => POOL layer set
model.add(Conv2D(32, (3, 3), padding='same', kernel_initializer="he_normal",
                 input_shape=(img_rows, img_cols, 3)))
model.add(Activation('elu'))
model.add(BatchNormalization())
model.add(Conv2D(32, (3, 3), padding="same", kernel_initializer="he_normal"))
model.add(Activation('elu'))
model.add(BatchNormalization())
model.add(MaxPooling2D(pool_size=(2, 2)))
model.add(Dropout(0.2))

# Block #2: second CONV => ELU => CONV => ELU => POOL layer set
model.add(Conv2D(64, (3, 3), padding="same", kernel_initializer="he_normal"))
model.add(Activation('elu'))
model.add(BatchNormalization())
model.add(Conv2D(64, (3, 3), padding="same", kernel_initializer="he_normal"))
model.add(Activation('elu'))
model.add(BatchNormalization())
model.add(MaxPooling2D(pool_size=(2, 2)))
model.add(Dropout(0.2))

# Block #3: third CONV => ELU => CONV => ELU => POOL layer set
model.add(Conv2D(128, (3, 3), padding="same", kernel_initializer="he_normal"))
model.add(Activation('elu'))
model.add(BatchNormalization())
model.add(Conv2D(128, (3, 3), padding="same", kernel_initializer="he_normal"))
model.add(Activation('elu'))
model.add(BatchNormalization())
model.add(MaxPooling2D(pool_size=(2, 2)))
model.add(Dropout(0.2))

# Block #4: fourth CONV => ELU => CONV => ELU => POOL layer set
model.add(Conv2D(256, (3, 3), padding="same", kernel_initializer="he_normal"))
model.add(Activation('elu'))
model.add(BatchNormalization())
model.add(Conv2D(256, (3, 3), padding="same", kernel_initializer="he_normal"))
model.add(Activation('elu'))
model.add(BatchNormalization())
model.add(MaxPooling2D(pool_size=(2, 2)))
model.add(Dropout(0.2))

# Block #5: first set of FC => ELU layers
model.add(Flatten())
model.add(Dense(64, kernel_initializer="he_normal"))
model.add(Activation('elu'))
model.add(BatchNormalization())
model.add(Dropout(0.5))

# Block #6: second set of FC => ELU layers
model.add(Dense(64, kernel_initializer="he_normal"))
model.add(Activation('elu'))
model.add(BatchNormalization())
model.add(Dropout(0.5))

# Block #7: softmax classifier
model.add(Dense(num_classes, kernel_initializer="he_normal"))
model.add(Activation("softmax"))

print(model.summary())
```

Output:
```
Model: "sequential"
_________________________________________________________________
Layer (type)                 Output Shape              Param #
=================================================================
conv2d (Conv2D)              (None, 48, 48, 32)        896
_________________________________________________________________
activation (Activation)      (None, 48, 48, 32)        0
_________________________________________________________________
batch_normalization (BatchNo (None, 48, 48, 32)        128
_________________________________________________________________
conv2d_1 (Conv2D)            (None, 48, 48, 32)        9248
_________________________________________________________________
activation_1 (Activation)    (None, 48, 48, 32)        0
_________________________________________________________________
batch_normalization_1 (Batch (None, 48, 48, 32)        128
_________________________________________________________________
max_pooling2d (MaxPooling2D) (None, 24, 24, 32)        0
_________________________________________________________________
dropout (Dropout)            (None, 24, 24, 32)        0
_________________________________________________________________
conv2d_2 (Conv2D)            (None, 24, 24, 64)        18496
_________________________________________________________________
activation_2 (Activation)    (None, 24, 24, 64)        0
_________________________________________________________________
batch_normalization_2 (Batch (None, 24, 24, 64)        256
_________________________________________________________________
conv2d_3 (Conv2D)            (None, 24, 24, 64)        36928
_________________________________________________________________
activation_3 (Activation)    (None, 24, 24, 64)        0
_________________________________________________________________
batch_normalization_3 (Batch (None, 24, 24, 64)        256
_________________________________________________________________
max_pooling2d_1 (MaxPooling2 (None, 12, 12, 64)        0
_________________________________________________________________
dropout_1 (Dropout)          (None, 12, 12, 64)        0
_________________________________________________________________
conv2d_4 (Conv2D)            (None, 12, 12, 128)       73856
_________________________________________________________________
activation_4 (Activation)    (None, 12, 12, 128)       0
_________________________________________________________________
batch_normalization_4 (Batch (None, 12, 12, 128)       512
_________________________________________________________________
conv2d_5 (Conv2D)            (None, 12, 12, 128)       147584
_________________________________________________________________
activation_5 (Activation)    (None, 12, 12, 128)       0
_________________________________________________________________
batch_normalization_5 (Batch (None, 12, 12, 128)       512
_________________________________________________________________
max_pooling2d_2 (MaxPooling2 (None, 6, 6, 128)         0
_________________________________________________________________
dropout_2 (Dropout)          (None, 6, 6, 128)         0
_________________________________________________________________
conv2d_6 (Conv2D)            (None, 6, 6, 256)         295168
_________________________________________________________________
activation_6 (Activation)    (None, 6, 6, 256)         0
_________________________________________________________________
batch_normalization_6 (Batch (None, 6, 6, 256)         1024
_________________________________________________________________
conv2d_7 (Conv2D)            (None, 6, 6, 256)         590080
_________________________________________________________________
activation_7 (Activation)    (None, 6, 6, 256)         0
_________________________________________________________________
batch_normalization_7 (Batch (None, 6, 6, 256)         1024
_________________________________________________________________
max_pooling2d_3 (MaxPooling2 (None, 3, 3, 256)         0
_________________________________________________________________
dropout_3 (Dropout)          (None, 3, 3, 256)         0
_________________________________________________________________
flatten (Flatten)            (None, 2304)              0
_________________________________________________________________
dense (Dense)                (None, 64)                147520
_________________________________________________________________
activation_8 (Activation)    (None, 64)                0
_________________________________________________________________
batch_normalization_8 (Batch (None, 64)                256
_________________________________________________________________
dropout_4 (Dropout)          (None, 64)                0
_________________________________________________________________
dense_1 (Dense)              (None, 64)                4160
_________________________________________________________________
activation_9 (Activation)    (None, 64)                0
_________________________________________________________________
batch_normalization_9 (Batch (None, 64)                256
_________________________________________________________________
dropout_5 (Dropout)          (None, 64)                0
_________________________________________________________________
dense_2 (Dense)              (None, 4)                 260
_________________________________________________________________
activation_10 (Activation)   (None, 4)                 0
=================================================================
Total params: 1,328,548
Trainable params: 1,326,372
Non-trainable params: 2,176
_________________________________________________________________
None
```
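The eight convolutional layers above all follow the same CONV => ELU => BN pattern, so the definition can be factored into a helper. A minimal sketch, not from the notebook, assuming `img_rows = img_cols = 48` and `num_classes = 4` as the summary above shows:

```python
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import (Activation, BatchNormalization, Conv2D,
                                     Dense, Dropout, Flatten, MaxPooling2D)

img_rows, img_cols, num_classes = 48, 48, 4  # assumed from the summary above

def conv_block(model, filters, input_shape=None):
    """Two CONV => ELU => BN stacks followed by POOL and Dropout."""
    for i in range(2):
        kwargs = dict(padding="same", kernel_initializer="he_normal")
        if i == 0 and input_shape is not None:
            kwargs["input_shape"] = input_shape
        model.add(Conv2D(filters, (3, 3), **kwargs))
        model.add(Activation("elu"))
        model.add(BatchNormalization())
    model.add(MaxPooling2D(pool_size=(2, 2)))
    model.add(Dropout(0.2))

model = Sequential()
conv_block(model, 32, input_shape=(img_rows, img_cols, 3))
for filters in (64, 128, 256):
    conv_block(model, filters)
model.add(Flatten())
for _ in range(2):                        # the two FC => ELU => BN blocks
    model.add(Dense(64, kernel_initializer="he_normal"))
    model.add(Activation("elu"))
    model.add(BatchNormalization())
    model.add(Dropout(0.5))
model.add(Dense(num_classes, kernel_initializer="he_normal"))
model.add(Activation("softmax"))          # same layer sequence as above (1,328,548 params)
```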
### Training our Model
```python
from tensorflow.keras.optimizers import RMSprop, SGD, Adam
from tensorflow.keras.callbacks import ModelCheckpoint, EarlyStopping, ReduceLROnPlateau

checkpoint = ModelCheckpoint("face_recognition_friends_vgg.h5",
                             monitor="val_loss",
                             mode="min",
                             save_best_only=True,
                             verbose=1)

earlystop = EarlyStopping(monitor='val_loss',
                          min_delta=0,
                          patience=3,
                          verbose=1,
                          restore_best_weights=True)

reduce_lr = ReduceLROnPlateau(monitor='val_loss', factor=0.2, patience=3, verbose=1, min_delta=0.0001)

# We put our callbacks into a callback list
callbacks = [earlystop, checkpoint, reduce_lr]

# Note: lr=0.01 is actually ten times Adam's default of 0.001, not a small rate
model.compile(loss='categorical_crossentropy',
              optimizer=Adam(lr=0.01),
              metrics=['accuracy'])

nb_train_samples = 2663
nb_validation_samples = 955
epochs = 1

history = model.fit_generator(
    train_generator,
    steps_per_epoch=nb_train_samples // batch_size,
    epochs=epochs,
    callbacks=callbacks,
    validation_data=validation_generator,
    validation_steps=nb_validation_samples // batch_size)
```

Output:
```
WARNING:tensorflow:sample_weight modes were coerced from ... to ['...']
WARNING:tensorflow:sample_weight modes were coerced from ... to ['...']
Train for 166 steps, validate for 59 steps
165/166 [============================>.] - ETA: 0s - loss: 0.8249 - accuracy: 0.6743
Epoch 00001: val_loss improved from inf to 2.41761, saving model to face_recognition_friends_vgg.h5
166/166 [==============================] - 73s 437ms/step - loss: 0.8229 - accuracy: 0.6747 - val_loss: 2.4176 - val_accuracy: 0.3623
```
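`fit_generator` was deprecated around TensorFlow 2.1; `Model.fit` accepts the same generators directly. A hedged sketch of the equivalent call, reusing the notebook's variables:

```python
# Equivalent modern call (assumes the train_generator / validation_generator and
# counts defined above; in TF >= 2.1, Model.fit accepts generators directly):
history = model.fit(
    train_generator,
    steps_per_epoch=nb_train_samples // batch_size,
    epochs=epochs,
    callbacks=callbacks,
    validation_data=validation_generator,
    validation_steps=nb_validation_samples // batch_size)
```

The gap between training accuracy (0.67) and validation accuracy (0.36) after one epoch suggests the lr=0.01 setting overshoots; the ReduceLROnPlateau callback is there to correct exactly that over longer runs.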
#### Getting our Class Labels
```python
class_labels = validation_generator.class_indices
class_labels = {v: k for k, v in class_labels.items()}
classes = list(class_labels.values())
class_labels
```

Output:
```
{0: 'Chandler', 1: 'Joey', 2: 'Pheobe', 3: 'Rachel'}
```
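A small illustrative sketch (not notebook code) of how this inverted `class_indices` map is used to decode a single prediction:

```python
import numpy as np

class_labels = {0: 'Chandler', 1: 'Joey', 2: 'Pheobe', 3: 'Rachel'}
preds = np.array([0.05, 0.80, 0.05, 0.10])   # stand-in for classifier.predict(x)[0]
print(class_labels[int(np.argmax(preds))])   # -> Joey
```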
```python
# Load our model
from tensorflow.keras.models import load_model

classifier = load_model('face_recognition_friends_vgg.h5')
```
### Testing our model on some real video
```python
from os import listdir
from os.path import isfile, join
import os
import cv2
import numpy as np
import dlib  # added: the recorded run failed with NameError because this import was missing
from tensorflow.keras.preprocessing.image import img_to_array  # added: also needed below

face_classes = {0: 'Chandler', 1: 'Joey', 2: 'Pheobe', 3: 'Rachel'}

def draw_label(image, point, label, font=cv2.FONT_HERSHEY_SIMPLEX,
               font_scale=0.8, thickness=1):
    size = cv2.getTextSize(label, font, font_scale, thickness)[0]
    x, y = point
    cv2.rectangle(image, (x, y - size[1]), (x + size[0], y), (255, 0, 0), cv2.FILLED)
    cv2.putText(image, label, point, font, font_scale, (255, 255, 255), thickness, lineType=cv2.LINE_AA)

margin = 0.2
# load model and weights
img_size = 64

detector = dlib.get_frontal_face_detector()

cap = cv2.VideoCapture('testfriends.mp4')

while True:
    ret, frame = cap.read()
    frame = cv2.resize(frame, None, fx=0.5, fy=0.5, interpolation=cv2.INTER_LINEAR)
    preprocessed_faces = []

    input_img = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)
    img_h, img_w, _ = np.shape(input_img)
    detected = detector(frame, 1)
    faces = np.empty((len(detected), img_size, img_size, 3))

    preprocessed_faces_emo = []
    if len(detected) > 0:
        for i, d in enumerate(detected):
            x1, y1, x2, y2, w, h = d.left(), d.top(), d.right() + 1, d.bottom() + 1, d.width(), d.height()
            xw1 = max(int(x1 - margin * w), 0)
            yw1 = max(int(y1 - margin * h), 0)
            xw2 = min(int(x2 + margin * w), img_w - 1)
            yw2 = min(int(y2 + margin * h), img_h - 1)
            cv2.rectangle(frame, (x1, y1), (x2, y2), (255, 0, 0), 2)
            # cv2.rectangle(img, (xw1, yw1), (xw2, yw2), (255, 0, 0), 2)
            # faces[i, :, :, :] = cv2.resize(frame[yw1:yw2 + 1, xw1:xw2 + 1, :], (img_size, img_size))
            face = frame[yw1:yw2 + 1, xw1:xw2 + 1, :]
            face = cv2.resize(face, (48, 48), interpolation=cv2.INTER_AREA)  # 48x48 matches the training input size
            face = face.astype("float") / 255.0
            face = img_to_array(face)
            face = np.expand_dims(face, axis=0)
            preprocessed_faces.append(face)

        # make a prediction for each detected face
        face_labels = []
        for i, d in enumerate(detected):
            preds = classifier.predict(preprocessed_faces[i])[0]
            face_labels.append(face_classes[preds.argmax()])

        # draw results
        for i, d in enumerate(detected):
            label = "{}".format(face_labels[i])
            draw_label(frame, (d.left(), d.top()), label)

    cv2.imshow("Friend Character Identifier", frame)
    if cv2.waitKey(1) == 13:  # 13 is the Enter key
        break

cap.release()
cv2.destroyAllWindows()
```

Output:
```
NameError: name 'dlib' is not defined
```
(raised at `detector = dlib.get_frontal_face_detector()` in the recorded run, before the `import dlib` above was added)
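The box-expansion arithmetic in the loop above is easy to get wrong at image borders; here it is pulled out into a standalone function (illustrative names, not from the notebook):

```python
def expand_box(x1, y1, x2, y2, img_w, img_h, margin=0.2):
    """Grows a face box by `margin` of its size on each side, clamped to the image."""
    w, h = x2 - x1, y2 - y1
    xw1 = max(int(x1 - margin * w), 0)
    yw1 = max(int(y1 - margin * h), 0)
    xw2 = min(int(x2 + margin * w), img_w - 1)
    yw2 = min(int(y2 + margin * h), img_h - 1)
    return xw1, yw1, xw2, yw2

print(expand_box(10, 10, 60, 60, img_w=320, img_h=180))  # (0, 0, 70, 70): clamped at the top-left corner
```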
(The notebook ends with an empty trailing code cell and standard metadata: Python 3 kernel, nbformat 4.2, Python 3.7.4.)
+version https://git-lfs.github.com/spec/v1
+oid sha256:784af02a73ac8697fc56342a62f4a599fcbddcb44476d64e828af3e086f4b624
+size 23012
25. Face Recognition/25.2 Face Recogition - Matching Faces.ipynb
CHANGED

(The diff for this file is too large to render; see the raw diff.)
25. Face Recognition/25.3 Face Recogition - One Shot Learning.ipynb
CHANGED

@@ -1,406 +1,3 @@
(removed notebook content:)

## 1. Extract faces from pictures of people
### Instructions:
- Place photos of people (one face visible) in the folder called "./people"
- Replace my photo titled "Rajeev.jpg" with a picture of your face for testing on a webcam
- Faces are extracted using the haarcascade_frontalface_default detector model
- Extracted faces are placed in the folder called "./group_of_faces"
#### We are extracting the faces needed for our one-shot learning model; it will load five extracted faces
```python
# The code below extracts faces from images and places them in the folder
from os import listdir
from os.path import isfile, join
import cv2

# Loading our HAAR Cascade face detector
face_detector = cv2.CascadeClassifier('Haarcascades/haarcascade_frontalface_default.xml')

# Directory of images of the people we'll be extracting faces from
mypath = "./people/"
image_file_names = [f for f in listdir(mypath) if isfile(join(mypath, f))]
print("Collected image names")

for image_name in image_file_names:
    person_image = cv2.imread(mypath + image_name)
    face_info = face_detector.detectMultiScale(person_image, 1.3, 5)
    for (x, y, w, h) in face_info:
        face = person_image[y:y+h, x:x+w]
        roi = cv2.resize(face, (128, 128), interpolation=cv2.INTER_CUBIC)
        path = "./group_of_faces/" + "face_" + image_name
        cv2.imwrite(path, roi)
        cv2.imshow("face", roi)

    cv2.waitKey(0)
cv2.destroyAllWindows()
```

Output:
```
Collected image names
```
## 2. Load our VGGFace model
- This block of code defines the VGGFace model (which we use later) and loads its pretrained weights
```python
# author: Sefik Ilkin Serengil
# documentation for this code: https://sefiks.com/2018/08/06/deep-face-recognition-with-keras/

import numpy as np
import cv2

from tensorflow.keras.models import Model, Sequential
from tensorflow.keras.layers import Input, Convolution2D, ZeroPadding2D, MaxPooling2D, Flatten, Dense, Dropout, Activation
from PIL import Image
from tensorflow.keras.preprocessing.image import load_img, save_img, img_to_array
from tensorflow.keras.applications.imagenet_utils import preprocess_input
from tensorflow.keras.preprocessing import image
import matplotlib.pyplot as plt
from os import listdir

def preprocess_image(image_path):
    """Loads image from path and resizes it"""
    img = load_img(image_path, target_size=(224, 224))
    img = img_to_array(img)
    img = np.expand_dims(img, axis=0)
    img = preprocess_input(img)
    return img

model = Sequential()
model.add(ZeroPadding2D((1, 1), input_shape=(224, 224, 3)))
model.add(Convolution2D(64, (3, 3), activation='relu'))
model.add(ZeroPadding2D((1, 1)))
model.add(Convolution2D(64, (3, 3), activation='relu'))
model.add(MaxPooling2D((2, 2), strides=(2, 2)))

model.add(ZeroPadding2D((1, 1)))
model.add(Convolution2D(128, (3, 3), activation='relu'))
model.add(ZeroPadding2D((1, 1)))
model.add(Convolution2D(128, (3, 3), activation='relu'))
model.add(MaxPooling2D((2, 2), strides=(2, 2)))

model.add(ZeroPadding2D((1, 1)))
model.add(Convolution2D(256, (3, 3), activation='relu'))
model.add(ZeroPadding2D((1, 1)))
model.add(Convolution2D(256, (3, 3), activation='relu'))
model.add(ZeroPadding2D((1, 1)))
model.add(Convolution2D(256, (3, 3), activation='relu'))
model.add(MaxPooling2D((2, 2), strides=(2, 2)))

model.add(ZeroPadding2D((1, 1)))
model.add(Convolution2D(512, (3, 3), activation='relu'))
model.add(ZeroPadding2D((1, 1)))
model.add(Convolution2D(512, (3, 3), activation='relu'))
model.add(ZeroPadding2D((1, 1)))
model.add(Convolution2D(512, (3, 3), activation='relu'))
model.add(MaxPooling2D((2, 2), strides=(2, 2)))

model.add(ZeroPadding2D((1, 1)))
model.add(Convolution2D(512, (3, 3), activation='relu'))
model.add(ZeroPadding2D((1, 1)))
model.add(Convolution2D(512, (3, 3), activation='relu'))
model.add(ZeroPadding2D((1, 1)))
model.add(Convolution2D(512, (3, 3), activation='relu'))
model.add(MaxPooling2D((2, 2), strides=(2, 2)))

model.add(Convolution2D(4096, (7, 7), activation='relu'))
model.add(Dropout(0.5))
model.add(Convolution2D(4096, (1, 1), activation='relu'))
model.add(Dropout(0.5))
model.add(Convolution2D(2622, (1, 1)))
model.add(Flatten())
model.add(Activation('softmax'))

# you can download pretrained weights from https://drive.google.com/file/d/1CPSeum3HpopfomUEK1gybeuIVoeJT_Eo/view?usp=sharing
from tensorflow.keras.models import model_from_json
model.load_weights('vgg_face_weights.h5')

vgg_face_descriptor = Model(inputs=model.layers[0].input, outputs=model.layers[-2].output)

model = vgg_face_descriptor

print("Model Loaded")
```

Output:
```
Model Loaded
```
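What the truncated model returns, sketched with a dummy input (an assumption, not notebook code; it reuses the `model` built above with weights loaded — the `layers[-2]` cut keeps the 2622-dimensional Flatten output and drops only the softmax):

```python
import numpy as np

dummy_face = np.zeros((1, 224, 224, 3), dtype="float32")  # stand-in for preprocess_image(...)
embedding = model.predict(dummy_face)[0, :]
print(embedding.shape)  # (2622,) -- the face descriptor compared below
```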
## 3. Test model using your Webcam
This code looks up the faces you extracted into the "group_of_faces" folder and uses cosine similarity to decide which stored face is most similar to the one captured by your webcam.
```python
# points to your extracted faces
people_pictures = "./group_of_faces/"

all_people_faces = dict()

for file in listdir(people_pictures):
    person_face, extension = file.split(".")
    all_people_faces[person_face] = model.predict(preprocess_image('./group_of_faces/%s.jpg' % (person_face)))[0, :]

print("Face representations retrieved successfully")

def findCosineSimilarity(source_representation, test_representation):
    a = np.matmul(np.transpose(source_representation), test_representation)
    b = np.sum(np.multiply(source_representation, source_representation))
    c = np.sum(np.multiply(test_representation, test_representation))
    return 1 - (a / (np.sqrt(b) * np.sqrt(c)))

# Open webcam
cap = cv2.VideoCapture(0)

while(True):
    ret, img = cap.read()
    faces = face_detector.detectMultiScale(img, 1.3, 5)

    for (x, y, w, h) in faces:
        if w > 100:  # adjust accordingly if your webcam resolution is higher
            cv2.rectangle(img, (x, y), (x+w, y+h), (255, 0, 0), 2)  # draw rectangle on main image
            detected_face = img[int(y):int(y+h), int(x):int(x+w)]   # crop detected face
            detected_face = cv2.resize(detected_face, (224, 224))   # resize to 224x224

            img_pixels = image.img_to_array(detected_face)
            img_pixels = np.expand_dims(img_pixels, axis=0)
            img_pixels /= 255

            captured_representation = model.predict(img_pixels)[0, :]

            found = 0
            for i in all_people_faces:
                person_name = i
                representation = all_people_faces[i]

                similarity = findCosineSimilarity(representation, captured_representation)
                if(similarity < 0.30):
                    cv2.putText(img, person_name[5:], (int(x+w+15), int(y-12)), cv2.FONT_HERSHEY_SIMPLEX, 1, (0, 0, 255), 2)
                    found = 1
                    break

            # connect face and text
            cv2.line(img, (int((x+x+w)/2), y+15), (x+w, y-20), (255, 0, 0), 1)
            cv2.line(img, (x+w, y-20), (x+w+10, y-20), (255, 0, 0), 1)

            if(found == 0):  # the detected face is not in our people database
                cv2.putText(img, 'unknown', (int(x+w+15), int(y-12)), cv2.FONT_HERSHEY_SIMPLEX, 1, (255, 0, 0), 2)

    cv2.imshow('img', img)

    if cv2.waitKey(1) == 13:  # 13 is the Enter key
        break

cap.release()
cv2.destroyAllWindows()
```

Output:
```
Face representations retrieved successfully
```
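Note that despite its name, `findCosineSimilarity` returns the cosine *distance* (1 minus the cosine similarity), so smaller means more similar and the 0.30 threshold is a distance cutoff. A quick self-contained check:

```python
import numpy as np

def cosine_distance(a, b):
    return 1 - np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))

a = np.array([1.0, 0.0])
b = np.array([0.0, 1.0])
print(cosine_distance(a, a))  # 0.0 -> identical descriptors
print(cosine_distance(a, b))  # 1.0 -> unrelated; matches require < 0.30 above
```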
## Test on a video
### Since we're using the Friends TV series characters, let's extract the faces from the images I placed in the "./friends" folder
```python
from os import listdir
from os.path import isfile, join
import cv2

# Loading our HAAR Cascade face detector
face_detector = cv2.CascadeClassifier('Haarcascades/haarcascade_frontalface_default.xml')

# Directory of images of the people we'll be extracting faces from
mypath = "./friends/"
image_file_names = [f for f in listdir(mypath) if isfile(join(mypath, f))]
print("Collected image names")

for image_name in image_file_names:
    person_image = cv2.imread(mypath + image_name)
    face_info = face_detector.detectMultiScale(person_image, 1.3, 5)
    for (x, y, w, h) in face_info:
        face = person_image[y:y+h, x:x+w]
        roi = cv2.resize(face, (128, 128), interpolation=cv2.INTER_CUBIC)
        path = "./friends_faces/" + "face_" + image_name
        cv2.imwrite(path, roi)
        cv2.imshow("face", roi)

    cv2.waitKey(0)
cv2.destroyAllWindows()
```

Output:
```
Collected image names
```
### Again, we load our faces from the "friends_faces" directory and run our face recognition model on our test video
```python
# points to your extracted faces
people_pictures = "./friends_faces/"

all_people_faces = dict()

for file in listdir(people_pictures):
    person_face, extension = file.split(".")
    all_people_faces[person_face] = model.predict(preprocess_image('./friends_faces/%s.jpg' % (person_face)))[0, :]

print("Face representations retrieved successfully")

def findCosineSimilarity(source_representation, test_representation):
    a = np.matmul(np.transpose(source_representation), test_representation)
    b = np.sum(np.multiply(source_representation, source_representation))
    c = np.sum(np.multiply(test_representation, test_representation))
    return 1 - (a / (np.sqrt(b) * np.sqrt(c)))

cap = cv2.VideoCapture('testfriends.mp4')

while(True):
    ret, img = cap.read()
    img = cv2.resize(img, (320, 180))  # resize the video to a smaller size to improve face detection speed
    faces = face_detector.detectMultiScale(img, 1.3, 5)

    for (x, y, w, h) in faces:
        if w > 13:
            cv2.rectangle(img, (x, y), (x+w, y+h), (255, 0, 0), 2)  # draw rectangle on main image

            detected_face = img[int(y):int(y+h), int(x):int(x+w)]  # crop detected face
            detected_face = cv2.resize(detected_face, (224, 224))  # resize to 224x224

            img_pixels = image.img_to_array(detected_face)
            img_pixels = np.expand_dims(img_pixels, axis=0)
            img_pixels /= 255

            captured_representation = model.predict(img_pixels)[0, :]

            found = 0
            for i in all_people_faces:
                person_name = i
                representation = all_people_faces[i]

                similarity = findCosineSimilarity(representation, captured_representation)
                if(similarity < 0.30):
                    cv2.putText(img, person_name[5:], (int(x+w+15), int(y-12)), cv2.FONT_HERSHEY_SIMPLEX, 1, (0, 0, 255), 2)
                    found = 1
                    break

            # connect face and text
            cv2.line(img, (int((x+x+w)/2), y+15), (x+w, y-20), (255, 0, 0), 1)
            cv2.line(img, (x+w, y-20), (x+w+10, y-20), (255, 0, 0), 1)

            if(found == 0):  # the detected face is not in our people database
                cv2.putText(img, 'unknown', (int(x+w+15), int(y-12)), cv2.FONT_HERSHEY_SIMPLEX, 1, (255, 0, 0), 2)

    cv2.imshow('img', img)
    if cv2.waitKey(1) == 13:  # 13 is the Enter key
        break

# kill OpenCV windows
cap.release()
cv2.destroyAllWindows()
```

Output:
```
Face representations retrieved successfully
```
(This notebook also ends with an empty trailing code cell and standard metadata: Python 3 kernel, nbformat 4.2, Python 3.7.4.)
+version https://git-lfs.github.com/spec/v1
+oid sha256:3cd80b0b58742fc9c3563f3d11cf3e608b8b533ab498516b25072b291751f420
+size 15335
26. Credit Card/26. Credit Card Reader.ipynb
CHANGED

@@ -1,1066 +1,3 @@
(removed notebook content:)

# 1. Let's Create Our Credit Card Dataset
- There are two main font variations used on credit cards
```python
import cv2

cc1 = cv2.imread('creditcard_digits1.jpg', 0)
cv2.imshow("Digits 1", cc1)
cv2.waitKey(0)
cc2 = cv2.imread('creditcard_digits2.jpg', 0)
cv2.imshow("Digits 2", cc2)
cv2.waitKey(0)
cv2.destroyAllWindows()
```
```python
cc1 = cv2.imread('creditcard_digits2.jpg', 0)
_, th2 = cv2.threshold(cc1, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
cv2.imshow("Digits 2 Thresholded", th2)
cv2.waitKey(0)

cv2.destroyAllWindows()
```
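An aside on the `_` being discarded above: with `THRESH_OTSU`, `cv2.threshold` returns the threshold it computed as its first value. A tiny synthetic check (not from the notebook):

```python
import cv2
import numpy as np

# Bimodal image: half dark (40), half bright (200)
img = np.hstack([np.full((10, 10), 40, np.uint8),
                 np.full((10, 10), 200, np.uint8)])
t, binary = cv2.threshold(img, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
print(t)  # Otsu picks a threshold between the two modes automatically
```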
## Now let's generate an augmented dataset from these two samples
```python
# Create our dataset directories

import os

def makedir(directory):
    """Creates a new directory if it does not already exist"""
    if not os.path.exists(directory):
        os.makedirs(directory)  # os.makedirs(directory, exist_ok=True) would do the same without the check
        return None, 0

for i in range(0, 10):
    directory_name = "./credit_card/train/" + str(i)
    print(directory_name)
    makedir(directory_name)

for i in range(0, 10):
    directory_name = "./credit_card/test/" + str(i)
    print(directory_name)
    makedir(directory_name)
```

Output:
```
./credit_card/train/0
...
./credit_card/train/9
./credit_card/test/0
...
./credit_card/test/9
```
## Let's make our Data Augmentation Functions
These are used to perform image manipulation and pre-processing tasks
```python
import cv2
import numpy as np
import random
from scipy.ndimage import convolve

def DigitAugmentation(frame, dim=32):
    """Randomly alters the image using the noise, pixelation and stretching functions below"""
    frame = cv2.resize(frame, None, fx=2, fy=2, interpolation=cv2.INTER_CUBIC)
    frame = cv2.cvtColor(frame, cv2.COLOR_GRAY2RGB)
    random_num = np.random.randint(0, 9)

    if (random_num % 2 == 0):
        frame = add_noise(frame)
    if (random_num % 3 == 0):
        frame = pixelate(frame)
    if (random_num % 2 == 0):
        frame = stretch(frame)
    frame = cv2.resize(frame, (dim, dim), interpolation=cv2.INTER_AREA)

    return frame

def add_noise(image):
    """Adds salt-and-pepper noise to an image"""
    prob = random.uniform(0.01, 0.05)
    rnd = np.random.rand(image.shape[0], image.shape[1])
    noisy = image.copy()
    noisy[rnd < prob] = 0
    noisy[rnd > 1 - prob] = 1  # note: for uint8 images the "salt" value should arguably be 255, not 1
    return noisy

def pixelate(image):
    """Pixelates an image by reducing the resolution then upscaling it"""
    dim = np.random.randint(8, 12)
    image = cv2.resize(image, (dim, dim), interpolation=cv2.INTER_AREA)
    image = cv2.resize(image, (16, 16), interpolation=cv2.INTER_AREA)
    return image

def stretch(image):
    """Randomly applies different degrees of stretch to an image"""
    ran = np.random.randint(0, 3) * 2
    if np.random.randint(0, 2) == 0:
        frame = cv2.resize(image, (32, ran+32), interpolation=cv2.INTER_AREA)
        return frame[int(ran/2):int(ran+32)-int(ran/2), 0:32]
    else:
        frame = cv2.resize(image, (ran+32, 32), interpolation=cv2.INTER_AREA)
        return frame[0:32, int(ran/2):int(ran+32)-int(ran/2)]

def pre_process(image, inv=False):
    """Uses Otsu binarization on an image"""
    try:
        gray_image = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)
    except:
        gray_image = image

    if inv == False:
        _, th2 = cv2.threshold(gray_image, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    else:
        _, th2 = cv2.threshold(gray_image, 0, 255, cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)
    resized = cv2.resize(th2, (32, 32), interpolation=cv2.INTER_AREA)
    return resized
```
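A hedged demo of the pipeline above on a synthetic patch, so it runs without the card images (the seeds are set only to make the random branch choices repeatable):

```python
import random
import numpy as np

np.random.seed(0)
random.seed(0)

patch = np.zeros((48, 35), dtype=np.uint8)   # same size as a cropped digit ROI
patch[8:40, 10:25] = 255                     # crude white "digit" on black

out = DigitAugmentation(patch)               # uses the functions defined above
print(out.shape)                             # (32, 32, 3)
```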
## Testing our augmentation functions
```python
cc1 = cv2.imread('creditcard_digits2.jpg', 0)
_, th2 = cv2.threshold(cc1, 0, 255, cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)
cv2.imshow("cc1", th2)
cv2.waitKey(0)
cv2.destroyAllWindows()

# These are the coordinates of the region enclosing the first digit;
# they are preset and were found manually for this specific image
region = [(0, 0), (35, 48)]

# Assign values to each corner for ease of interpretation
top_left_y = region[0][1]
bottom_right_y = region[1][1]
top_left_x = region[0][0]
bottom_right_x = region[1][0]

for i in range(0, 1):  # we only look at the first digit when testing our augmentation functions
    roi = cc1[top_left_y:bottom_right_y, top_left_x:bottom_right_x]
    for j in range(0, 10):
        roi2 = DigitAugmentation(roi)
        roi_otsu = pre_process(roi2, inv=False)
        cv2.imshow("otsu", roi_otsu)
        cv2.waitKey(0)

cv2.destroyAllWindows()
```
## Creating our Training Data (2,000 variations of each digit per font type)
```python
# Creating 2000 images for each digit in creditcard_digits1 - TRAINING DATA

# Load our first image
cc1 = cv2.imread('creditcard_digits1.jpg', 0)

_, th2 = cv2.threshold(cc1, 0, 255, cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)
cv2.imshow("cc1", th2)
cv2.imshow("creditcard_digits1", cc1)
cv2.waitKey(0)
cv2.destroyAllWindows()

region = [(2, 19), (50, 72)]

top_left_y = region[0][1]
bottom_right_y = region[1][1]
top_left_x = region[0][0]
bottom_right_x = region[1][0]

for i in range(0, 10):
    # We jump to the next digit each time we loop
    if i > 0:
        top_left_x = top_left_x + 59
        bottom_right_x = bottom_right_x + 59

    roi = cc1[top_left_y:bottom_right_y, top_left_x:bottom_right_x]
    print("Augmenting Digit - ", str(i))
    # We create 2000 versions of each digit for our dataset
    for j in range(0, 2000):
        roi2 = DigitAugmentation(roi)
        roi_otsu = pre_process(roi2, inv=True)
        cv2.imwrite("./credit_card/train/" + str(i) + "./_1_" + str(j) + ".jpg", roi_otsu)
cv2.destroyAllWindows()
```

Output:
```
Augmenting Digit - 0
...
Augmenting Digit - 9
```
```python
# Creating 2000 images for each digit in creditcard_digits2 - TRAINING DATA

cc1 = cv2.imread('creditcard_digits2.jpg', 0)
_, th2 = cv2.threshold(cc1, 0, 255, cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)
cv2.imshow("cc1", th2)
cv2.waitKey(0)
cv2.destroyAllWindows()

region = [(0, 0), (35, 48)]

top_left_y = region[0][1]
bottom_right_y = region[1][1]
top_left_x = region[0][0]
bottom_right_x = region[1][0]

for i in range(0, 10):
    if i > 0:
        # We jump to the next digit each time we loop
        top_left_x = top_left_x + 35
        bottom_right_x = bottom_right_x + 35

    roi = cc1[top_left_y:bottom_right_y, top_left_x:bottom_right_x]
    print("Augmenting Digit - ", str(i))
    # We create 2000 versions of each digit for our dataset
    for j in range(0, 2000):
        roi2 = DigitAugmentation(roi)
        roi_otsu = pre_process(roi2, inv=False)
        cv2.imwrite("./credit_card/train/" + str(i) + "./_2_" + str(j) + ".jpg", roi_otsu)
cv2.destroyAllWindows()
```

(This cell was saved without a recorded output.)
```python
# Creating 2000 images for each digit in creditcard_digits1 - TEST DATA

# Load our first image
cc1 = cv2.imread('creditcard_digits1.jpg', 0)

_, th2 = cv2.threshold(cc1, 0, 255, cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)
cv2.imshow("cc1", th2)
cv2.imshow("creditcard_digits1", cc1)
cv2.waitKey(0)
cv2.destroyAllWindows()

region = [(2, 19), (50, 72)]

top_left_y = region[0][1]
bottom_right_y = region[1][1]
top_left_x = region[0][0]
bottom_right_x = region[1][0]

for i in range(0, 10):
    # We jump to the next digit each time we loop
    if i > 0:
        top_left_x = top_left_x + 59
        bottom_right_x = bottom_right_x + 59

    roi = cc1[top_left_y:bottom_right_y, top_left_x:bottom_right_x]
    print("Augmenting Digit -", str(i))
    # We create 2000 versions of each digit for our dataset
    for j in range(0, 2000):
        roi2 = DigitAugmentation(roi)
        roi_otsu = pre_process(roi2, inv=True)
        cv2.imwrite("./credit_card/test/" + str(i) + "./_1_" + str(j) + ".jpg", roi_otsu)
cv2.destroyAllWindows()
```

Output:
```
Augmenting Digit - 0
...
Augmenting Digit - 9
```
{
|
| 380 |
-
"cell_type": "code",
|
| 381 |
-
"execution_count": 12,
|
| 382 |
-
"metadata": {},
|
| 383 |
-
"outputs": [
|
| 384 |
-
{
|
| 385 |
-
"name": "stdout",
|
| 386 |
-
"output_type": "stream",
|
| 387 |
-
"text": [
|
| 388 |
-
"Augmenting Digit - 0\n",
|
| 389 |
-
"Augmenting Digit - 1\n",
|
| 390 |
-
"Augmenting Digit - 2\n",
|
| 391 |
-
"Augmenting Digit - 3\n",
|
| 392 |
-
"Augmenting Digit - 4\n",
|
| 393 |
-
"Augmenting Digit - 5\n",
|
| 394 |
-
"Augmenting Digit - 6\n",
|
| 395 |
-
"Augmenting Digit - 7\n",
|
| 396 |
-
"Augmenting Digit - 8\n",
|
| 397 |
-
"Augmenting Digit - 9\n"
|
| 398 |
-
]
|
| 399 |
-
}
|
| 400 |
-
],
|
| 401 |
-
"source": [
|
| 402 |
-
"# Creating 200 Images for each digit in creditcard_digits2 - TEST DATA\n",
|
| 403 |
-
"\n",
|
| 404 |
-
"cc1 = cv2.imread('creditcard_digits2.jpg', 0)\n",
|
| 405 |
-
"_, th2 = cv2.threshold(cc1, 0, 255, cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)\n",
|
| 406 |
-
"cv2.imshow(\"cc1\", th2)\n",
|
| 407 |
-
"cv2.waitKey(0)\n",
|
| 408 |
-
"cv2.destroyAllWindows()\n",
|
| 409 |
-
"\n",
|
| 410 |
-
"region = [(0, 0), (35, 48)]\n",
|
| 411 |
-
"\n",
|
| 412 |
-
"top_left_y = region[0][1]\n",
|
| 413 |
-
"bottom_right_y = region[1][1]\n",
|
| 414 |
-
"top_left_x = region[0][0]\n",
|
| 415 |
-
"bottom_right_x = region[1][0]\n",
|
| 416 |
-
"\n",
|
| 417 |
-
"for i in range(0,10): \n",
|
| 418 |
-
" if i > 0:\n",
|
| 419 |
-
" # We jump the next digit each time we loop\n",
|
| 420 |
-
" top_left_x = top_left_x + 35\n",
|
| 421 |
-
" bottom_right_x = bottom_right_x + 35\n",
|
| 422 |
-
"\n",
|
| 423 |
-
" roi = cc1[top_left_y:bottom_right_y, top_left_x:bottom_right_x]\n",
|
| 424 |
-
" print(\"Augmenting Digit - \", str(i))\n",
|
| 425 |
-
" # We create 200 versions of each image for our dataset\n",
|
| 426 |
-
" for j in range(0,2000):\n",
|
| 427 |
-
" roi2 = DigitAugmentation(roi)\n",
|
| 428 |
-
" roi_otsu = pre_process(roi2, inv = False)\n",
|
| 429 |
-
" cv2.imwrite(\"./credit_card/test/\"+str(i)+\"./_2_\"+str(j)+\".jpg\", roi_otsu)\n",
|
| 430 |
-
" #cv2.imshow(\"otsu\", roi_otsu)\n",
|
| 431 |
-
" #print(\"-\")\n",
|
| 432 |
-
" #cv2.waitKey(0)\n",
|
| 433 |
-
"cv2.destroyAllWindows()"
|
| 434 |
-
]
|
| 435 |
-
},
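For `flow_from_directory` in the next section to pick the classes up, the loops above must leave one sub-folder per digit. The expected layout is roughly:

credit_card/
    train/
        0/  1/  ...  9/
    test/
        0/  1/  ...  9/

Note the write path `"./credit_card/train/"+str(i)+"./_2_"+str(j)+".jpg"` contains a stray dot after the class index; on Windows a trailing dot in a path component is typically stripped, so the files still land in folders `0` through `9`, which matches the image counts Keras reports below.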
|
| 436 |
-
{
|
| 437 |
-
"cell_type": "markdown",
|
| 438 |
-
"metadata": {},
|
| 439 |
-
"source": [
|
| 440 |
-
"# 2. Creating our Classifier"
|
| 441 |
-
]
|
| 442 |
-
},
|
| 443 |
-
{
|
| 444 |
-
"cell_type": "code",
|
| 445 |
-
"execution_count": 13,
|
| 446 |
-
"metadata": {},
|
| 447 |
-
"outputs": [
|
| 448 |
-
{
|
| 449 |
-
"name": "stdout",
|
| 450 |
-
"output_type": "stream",
|
| 451 |
-
"text": [
|
| 452 |
-
"Found 20254 images belonging to 10 classes.\n",
|
| 453 |
-
"Found 40000 images belonging to 10 classes.\n"
|
| 454 |
-
]
|
| 455 |
-
}
|
| 456 |
-
],
|
| 457 |
-
"source": [
|
| 458 |
-
"import os\n",
|
| 459 |
-
"import numpy as np\n",
|
| 460 |
-
"from tensorflow.keras.models import Sequential\n",
|
| 461 |
-
"from tensorflow.keras.layers import Activation, Dropout, Flatten, Dense\n",
|
| 462 |
-
"from tensorflow.keras.preprocessing.image import ImageDataGenerator\n",
|
| 463 |
-
"from tensorflow.keras.layers import Conv2D, MaxPooling2D, ZeroPadding2D\n",
|
| 464 |
-
"from tensorflow.keras import optimizers\n",
|
| 465 |
-
"import tensorflow as tf\n",
|
| 466 |
-
"\n",
|
| 467 |
-
"input_shape = (32, 32, 3)\n",
|
| 468 |
-
"img_width = 32\n",
|
| 469 |
-
"img_height = 32\n",
|
| 470 |
-
"num_classes = 10\n",
|
| 471 |
-
"nb_train_samples = 10000\n",
|
| 472 |
-
"nb_validation_samples = 2000\n",
|
| 473 |
-
"batch_size = 16\n",
|
| 474 |
-
"epochs = 1\n",
|
| 475 |
-
"\n",
|
| 476 |
-
"train_data_dir = './credit_card/train'\n",
|
| 477 |
-
"validation_data_dir = './credit_card/test'\n",
|
| 478 |
-
"\n",
|
| 479 |
-
"# Creating our data generator for our test data\n",
|
| 480 |
-
"validation_datagen = ImageDataGenerator(\n",
|
| 481 |
-
" # used to rescale the pixel values from [0, 255] to [0, 1] interval\n",
|
| 482 |
-
" rescale = 1./255)\n",
|
| 483 |
-
"\n",
|
| 484 |
-
"# Creating our data generator for our training data\n",
|
| 485 |
-
"train_datagen = ImageDataGenerator(\n",
|
| 486 |
-
" rescale = 1./255, # normalize pixel values to [0,1]\n",
|
| 487 |
-
" rotation_range = 10, # randomly applies rotations\n",
|
| 488 |
-
" width_shift_range = 0.25, # randomly applies width shifting\n",
|
| 489 |
-
" height_shift_range = 0.25, # randomly applies height shifting\n",
|
| 490 |
-
" shear_range=0.5,\n",
|
| 491 |
-
" zoom_range=0.5,\n",
|
| 492 |
-
" horizontal_flip = False, # randonly flips the image\n",
|
| 493 |
-
" fill_mode = 'nearest') # uses the fill mode nearest to fill gaps created by the above\n",
|
| 494 |
-
"\n",
|
| 495 |
-
"# Specify criteria about our training data, such as the directory, image size, batch size and type \n",
|
| 496 |
-
"# automagically retrieve images and their classes for train and validation sets\n",
|
| 497 |
-
"train_generator = train_datagen.flow_from_directory(\n",
|
| 498 |
-
" train_data_dir,\n",
|
| 499 |
-
" target_size = (img_width, img_height),\n",
|
| 500 |
-
" batch_size = batch_size,\n",
|
| 501 |
-
" class_mode = 'categorical')\n",
|
| 502 |
-
"\n",
|
| 503 |
-
"validation_generator = validation_datagen.flow_from_directory(\n",
|
| 504 |
-
" validation_data_dir,\n",
|
| 505 |
-
" target_size = (img_width, img_height),\n",
|
| 506 |
-
" batch_size = batch_size,\n",
|
| 507 |
-
" class_mode = 'categorical',\n",
|
| 508 |
-
" shuffle = False) "
|
| 509 |
-
]
|
| 510 |
-
},
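A quick sanity check on the generators (a sketch; the expected shapes follow directly from the settings above):

x_batch, y_batch = next(train_generator)
print(x_batch.shape)  # (16, 32, 32, 3): batch_size, height, width, RGB channels
print(y_batch.shape)  # (16, 10): one-hot labels for the 10 digit classes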
|
| 511 |
-
{
|
| 512 |
-
"cell_type": "markdown",
|
| 513 |
-
"metadata": {},
|
| 514 |
-
"source": [
|
| 515 |
-
"## Creating out Model based on the LeNet CNN Architecture"
|
| 516 |
-
]
|
| 517 |
-
},
|
| 518 |
-
{
|
| 519 |
-
"cell_type": "code",
|
| 520 |
-
"execution_count": 15,
|
| 521 |
-
"metadata": {},
|
| 522 |
-
"outputs": [
|
| 523 |
-
{
|
| 524 |
-
"name": "stdout",
|
| 525 |
-
"output_type": "stream",
|
| 526 |
-
"text": [
|
| 527 |
-
"Model: \"sequential_1\"\n",
|
| 528 |
-
"_________________________________________________________________\n",
|
| 529 |
-
"Layer (type) Output Shape Param # \n",
|
| 530 |
-
"=================================================================\n",
|
| 531 |
-
"conv2d_2 (Conv2D) (None, 32, 32, 20) 1520 \n",
|
| 532 |
-
"_________________________________________________________________\n",
|
| 533 |
-
"activation_4 (Activation) (None, 32, 32, 20) 0 \n",
|
| 534 |
-
"_________________________________________________________________\n",
|
| 535 |
-
"max_pooling2d_2 (MaxPooling2 (None, 16, 16, 20) 0 \n",
|
| 536 |
-
"_________________________________________________________________\n",
|
| 537 |
-
"conv2d_3 (Conv2D) (None, 16, 16, 50) 25050 \n",
|
| 538 |
-
"_________________________________________________________________\n",
|
| 539 |
-
"activation_5 (Activation) (None, 16, 16, 50) 0 \n",
|
| 540 |
-
"_________________________________________________________________\n",
|
| 541 |
-
"max_pooling2d_3 (MaxPooling2 (None, 8, 8, 50) 0 \n",
|
| 542 |
-
"_________________________________________________________________\n",
|
| 543 |
-
"flatten_1 (Flatten) (None, 3200) 0 \n",
|
| 544 |
-
"_________________________________________________________________\n",
|
| 545 |
-
"dense_2 (Dense) (None, 500) 1600500 \n",
|
| 546 |
-
"_________________________________________________________________\n",
|
| 547 |
-
"activation_6 (Activation) (None, 500) 0 \n",
|
| 548 |
-
"_________________________________________________________________\n",
|
| 549 |
-
"dense_3 (Dense) (None, 10) 5010 \n",
|
| 550 |
-
"_________________________________________________________________\n",
|
| 551 |
-
"activation_7 (Activation) (None, 10) 0 \n",
|
| 552 |
-
"=================================================================\n",
|
| 553 |
-
"Total params: 1,632,080\n",
|
| 554 |
-
"Trainable params: 1,632,080\n",
|
| 555 |
-
"Non-trainable params: 0\n",
|
| 556 |
-
"_________________________________________________________________\n",
|
| 557 |
-
"None\n"
|
| 558 |
-
]
|
| 559 |
-
}
|
| 560 |
-
],
|
| 561 |
-
"source": [
|
| 562 |
-
"# create model\n",
|
| 563 |
-
"model = Sequential()\n",
|
| 564 |
-
"\n",
|
| 565 |
-
"# 2 sets of CRP (Convolution, RELU, Pooling)\n",
|
| 566 |
-
"model.add(Conv2D(20, (5, 5),\n",
|
| 567 |
-
" padding = \"same\", \n",
|
| 568 |
-
" input_shape = input_shape))\n",
|
| 569 |
-
"model.add(Activation(\"relu\"))\n",
|
| 570 |
-
"model.add(MaxPooling2D(pool_size = (2, 2), strides = (2, 2)))\n",
|
| 571 |
-
"\n",
|
| 572 |
-
"model.add(Conv2D(50, (5, 5),\n",
|
| 573 |
-
" padding = \"same\"))\n",
|
| 574 |
-
"model.add(Activation(\"relu\"))\n",
|
| 575 |
-
"model.add(MaxPooling2D(pool_size = (2, 2), strides = (2, 2)))\n",
|
| 576 |
-
"\n",
|
| 577 |
-
"# Fully connected layers (w/ RELU)\n",
|
| 578 |
-
"model.add(Flatten())\n",
|
| 579 |
-
"model.add(Dense(500))\n",
|
| 580 |
-
"model.add(Activation(\"relu\"))\n",
|
| 581 |
-
"\n",
|
| 582 |
-
"# Softmax (for classification)\n",
|
| 583 |
-
"model.add(Dense(num_classes))\n",
|
| 584 |
-
"model.add(Activation(\"softmax\"))\n",
|
| 585 |
-
" \n",
|
| 586 |
-
"model.compile(loss = 'categorical_crossentropy',\n",
|
| 587 |
-
" optimizer = tf.keras.optimizers.Adadelta(),\n",
|
| 588 |
-
" metrics = ['accuracy'])\n",
|
| 589 |
-
" \n",
|
| 590 |
-
"print(model.summary())"
|
| 591 |
-
]
|
| 592 |
-
},
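The parameter counts in the summary can be checked by hand; a Conv2D layer has (kernel_h * kernel_w * in_channels + 1) * filters parameters and a Dense layer has (inputs + 1) * units:

conv1  = (5 * 5 * 3 + 1) * 20     # 1,520
conv2  = (5 * 5 * 20 + 1) * 50    # 25,050
dense1 = (8 * 8 * 50 + 1) * 500   # 1,600,500 (the Flatten layer feeds 3,200 inputs)
dense2 = (500 + 1) * 10           # 5,010
print(conv1 + conv2 + dense1 + dense2)  # 1,632,080, matching the summary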
|
| 593 |
-
{
|
| 594 |
-
"cell_type": "markdown",
|
| 595 |
-
"metadata": {},
|
| 596 |
-
"source": [
|
| 597 |
-
"## Training our Model"
|
| 598 |
-
]
|
| 599 |
-
},
|
| 600 |
-
{
|
| 601 |
-
"cell_type": "code",
|
| 602 |
-
"execution_count": 17,
|
| 603 |
-
"metadata": {},
|
| 604 |
-
"outputs": [
|
| 605 |
-
{
|
| 606 |
-
"name": "stdout",
|
| 607 |
-
"output_type": "stream",
|
| 608 |
-
"text": [
|
| 609 |
-
"WARNING:tensorflow:sample_weight modes were coerced from\n",
|
| 610 |
-
" ...\n",
|
| 611 |
-
" to \n",
|
| 612 |
-
" ['...']\n",
|
| 613 |
-
"WARNING:tensorflow:sample_weight modes were coerced from\n",
|
| 614 |
-
" ...\n",
|
| 615 |
-
" to \n",
|
| 616 |
-
" ['...']\n",
|
| 617 |
-
"Train for 1250 steps, validate for 250 steps\n",
|
| 618 |
-
"1249/1250 [============================>.] - ETA: 0s - loss: 0.2923 - accuracy: 0.9020\n",
|
| 619 |
-
"Epoch 00001: val_loss improved from inf to 0.00030, saving model to creditcard.h5\n",
|
| 620 |
-
"1250/1250 [==============================] - 150s 120ms/step - loss: 0.2922 - accuracy: 0.9020 - val_loss: 3.0485e-04 - val_accuracy: 1.0000\n"
|
| 621 |
-
]
|
| 622 |
-
}
|
| 623 |
-
],
|
| 624 |
-
"source": [
|
| 625 |
-
"from tensorflow.keras.optimizers import RMSprop\n",
|
| 626 |
-
"from tensorflow.keras.callbacks import ModelCheckpoint, EarlyStopping\n",
|
| 627 |
-
" \n",
|
| 628 |
-
"checkpoint = ModelCheckpoint(\"creditcard.h5\",\n",
|
| 629 |
-
" monitor=\"val_loss\",\n",
|
| 630 |
-
" mode=\"min\",\n",
|
| 631 |
-
" save_best_only = True,\n",
|
| 632 |
-
" verbose=1)\n",
|
| 633 |
-
"\n",
|
| 634 |
-
"earlystop = EarlyStopping(monitor = 'val_loss', \n",
|
| 635 |
-
" min_delta = 0, \n",
|
| 636 |
-
" patience = 3,\n",
|
| 637 |
-
" verbose = 1,\n",
|
| 638 |
-
" restore_best_weights = True)\n",
|
| 639 |
-
"\n",
|
| 640 |
-
"# we put our call backs into a callback list\n",
|
| 641 |
-
"callbacks = [earlystop, checkpoint]\n",
|
| 642 |
-
"\n",
|
| 643 |
-
"# Note we use a very small learning rate \n",
|
| 644 |
-
"model.compile(loss = 'categorical_crossentropy',\n",
|
| 645 |
-
" optimizer = RMSprop(lr = 0.001),\n",
|
| 646 |
-
" metrics = ['accuracy'])\n",
|
| 647 |
-
"\n",
|
| 648 |
-
"nb_train_samples = 20000\n",
|
| 649 |
-
"nb_validation_samples = 4000\n",
|
| 650 |
-
"epochs = 1\n",
|
| 651 |
-
"batch_size = 16\n",
|
| 652 |
-
"\n",
|
| 653 |
-
"history = model.fit_generator(\n",
|
| 654 |
-
" train_generator,\n",
|
| 655 |
-
" steps_per_epoch = nb_train_samples // batch_size,\n",
|
| 656 |
-
" epochs = epochs,\n",
|
| 657 |
-
" callbacks = callbacks,\n",
|
| 658 |
-
" validation_data = validation_generator,\n",
|
| 659 |
-
" validation_steps = nb_validation_samples // batch_size)\n",
|
| 660 |
-
"\n",
|
| 661 |
-
"model.save(\"creditcard.h5\")"
|
| 662 |
-
]
|
| 663 |
-
},
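Since `save_best_only=True`, `creditcard.h5` already holds the best validation weights when the final `model.save` runs (with a single epoch the two saves are identical). A minimal sketch for reloading and re-scoring the checkpoint, assuming the generators above are still in scope:

from tensorflow.keras.models import load_model

best = load_model('creditcard.h5')
loss, acc = best.evaluate_generator(validation_generator,
                                    steps=nb_validation_samples // batch_size)
print(loss, acc)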
|
| 664 |
-
{
|
| 665 |
-
"cell_type": "markdown",
|
| 666 |
-
"metadata": {},
|
| 667 |
-
"source": [
|
| 668 |
-
"# 3. Extract a Credit Card from the backgroud\n",
|
| 669 |
-
"#### NOTE:\n",
|
| 670 |
-
"You may need to install imutils \n",
|
| 671 |
-
"run *pip install imutils* in terminal and restart your kernal to install"
|
| 672 |
-
]
|
| 673 |
-
},
|
| 674 |
-
{
|
| 675 |
-
"cell_type": "code",
|
| 676 |
-
"execution_count": 20,
|
| 677 |
-
"metadata": {},
|
| 678 |
-
"outputs": [
|
| 679 |
-
{
|
| 680 |
-
"name": "stdout",
|
| 681 |
-
"output_type": "stream",
|
| 682 |
-
"text": [
|
| 683 |
-
"Collecting scikit-image\n",
|
| 684 |
-
" Downloading scikit_image-0.17.2-cp37-cp37m-win_amd64.whl (11.5 MB)\n",
|
| 685 |
-
"Requirement already satisfied, skipping upgrade: numpy>=1.15.1 in c:\\programdata\\anaconda3\\envs\\cv\\lib\\site-packages (from scikit-image) (1.16.5)\n",
|
| 686 |
-
"Requirement already satisfied, skipping upgrade: networkx>=2.0 in c:\\programdata\\anaconda3\\envs\\cv\\lib\\site-packages (from scikit-image) (2.3)\n",
|
| 687 |
-
"Requirement already satisfied, skipping upgrade: pillow!=7.1.0,!=7.1.1,>=4.3.0 in c:\\programdata\\anaconda3\\envs\\cv\\lib\\site-packages (from scikit-image) (6.2.0)\n",
|
| 688 |
-
"Requirement already satisfied, skipping upgrade: scipy>=1.0.1 in c:\\programdata\\anaconda3\\envs\\cv\\lib\\site-packages (from scikit-image) (1.4.1)\n",
|
| 689 |
-
"Requirement already satisfied, skipping upgrade: matplotlib!=3.0.0,>=2.0.0 in c:\\programdata\\anaconda3\\envs\\cv\\lib\\site-packages (from scikit-image) (3.1.1)\n",
|
| 690 |
-
"Collecting tifffile>=2019.7.26\n",
|
| 691 |
-
" Downloading tifffile-2020.6.3-py3-none-any.whl (133 kB)\n",
|
| 692 |
-
"Requirement already satisfied, skipping upgrade: imageio>=2.3.0 in c:\\programdata\\anaconda3\\envs\\cv\\lib\\site-packages (from scikit-image) (2.6.0)\n",
|
| 693 |
-
"Collecting PyWavelets>=1.1.1\n",
|
| 694 |
-
" Downloading PyWavelets-1.1.1-cp37-cp37m-win_amd64.whl (4.2 MB)\n",
|
| 695 |
-
"Requirement already satisfied, skipping upgrade: decorator>=4.3.0 in c:\\programdata\\anaconda3\\envs\\cv\\lib\\site-packages (from networkx>=2.0->scikit-image) (4.4.0)\n",
|
| 696 |
-
"Requirement already satisfied, skipping upgrade: cycler>=0.10 in c:\\programdata\\anaconda3\\envs\\cv\\lib\\site-packages (from matplotlib!=3.0.0,>=2.0.0->scikit-image) (0.10.0)\n",
|
| 697 |
-
"Requirement already satisfied, skipping upgrade: kiwisolver>=1.0.1 in c:\\programdata\\anaconda3\\envs\\cv\\lib\\site-packages (from matplotlib!=3.0.0,>=2.0.0->scikit-image) (1.1.0)\n",
|
| 698 |
-
"Requirement already satisfied, skipping upgrade: pyparsing!=2.0.4,!=2.1.2,!=2.1.6,>=2.0.1 in c:\\programdata\\anaconda3\\envs\\cv\\lib\\site-packages (from matplotlib!=3.0.0,>=2.0.0->scikit-image) (2.4.2)\n",
|
| 699 |
-
"Requirement already satisfied, skipping upgrade: python-dateutil>=2.1 in c:\\programdata\\anaconda3\\envs\\cv\\lib\\site-packages (from matplotlib!=3.0.0,>=2.0.0->scikit-image) (2.8.0)\n",
|
| 700 |
-
"Requirement already satisfied, skipping upgrade: six in c:\\programdata\\anaconda3\\envs\\cv\\lib\\site-packages (from cycler>=0.10->matplotlib!=3.0.0,>=2.0.0->scikit-image) (1.12.0)\n",
|
| 701 |
-
"Requirement already satisfied, skipping upgrade: setuptools in c:\\programdata\\anaconda3\\envs\\cv\\lib\\site-packages (from kiwisolver>=1.0.1->matplotlib!=3.0.0,>=2.0.0->scikit-image) (41.4.0)\n",
|
| 702 |
-
"Installing collected packages: tifffile, PyWavelets, scikit-image\n",
|
| 703 |
-
" Attempting uninstall: PyWavelets\n",
|
| 704 |
-
" Found existing installation: PyWavelets 1.0.3\n",
|
| 705 |
-
" Uninstalling PyWavelets-1.0.3:\n",
|
| 706 |
-
" Successfully uninstalled PyWavelets-1.0.3\n"
|
| 707 |
-
]
|
| 708 |
-
},
|
| 709 |
-
{
|
| 710 |
-
"name": "stderr",
|
| 711 |
-
"output_type": "stream",
|
| 712 |
-
"text": [
|
| 713 |
-
" WARNING: Retrying (Retry(total=4, connect=None, read=None, redirect=None, status=None)) after connection broken by 'NewConnectionError('<pip._vendor.urllib3.connection.VerifiedHTTPSConnection object at 0x000002C6B8509088>: Failed to establish a new connection: [Errno 11001] getaddrinfo failed')': /packages/6e/c7/6411a4ce983bf06db8c3b8093b04b268c2580816f61156a5848e24e97118/scikit_image-0.17.2-cp37-cp37m-win_amd64.whl\n",
|
| 714 |
-
"ERROR: keras-vis 0.4.1 requires keras, which is not installed.\n",
|
| 715 |
-
"ERROR: Could not install packages due to an EnvironmentError: [WinError 5] Access is denied: 'c:\\\\programdata\\\\anaconda3\\\\envs\\\\cv\\\\lib\\\\site-packages\\\\~ywt\\\\_extensions\\\\_cwt.cp37-win_amd64.pyd'\n",
|
| 716 |
-
"Consider using the `--user` option or check the permissions.\n",
|
| 717 |
-
"\n"
|
| 718 |
-
]
|
| 719 |
-
}
|
| 720 |
-
],
|
| 721 |
-
"source": [
|
| 722 |
-
"!pip install -U scikit-image"
|
| 723 |
-
]
|
| 724 |
-
},
|
| 725 |
-
{
|
| 726 |
-
"cell_type": "code",
|
| 727 |
-
"execution_count": 24,
|
| 728 |
-
"metadata": {},
|
| 729 |
-
"outputs": [
|
| 730 |
-
{
|
| 731 |
-
"ename": "ImportError",
|
| 732 |
-
"evalue": "cannot import name 'filter' from 'skimage' (C:\\ProgramData\\Anaconda3\\envs\\cv\\lib\\site-packages\\skimage\\__init__.py)",
|
| 733 |
-
"output_type": "error",
|
| 734 |
-
"traceback": [
|
| 735 |
-
"\u001b[1;31m---------------------------------------------------------------------------\u001b[0m",
|
| 736 |
-
"\u001b[1;31mImportError\u001b[0m Traceback (most recent call last)",
|
| 737 |
-
"\u001b[1;32m<ipython-input-24-83690ff69298>\u001b[0m in \u001b[0;36m<module>\u001b[1;34m\u001b[0m\n\u001b[0;32m 3\u001b[0m \u001b[1;32mimport\u001b[0m \u001b[0mimutils\u001b[0m\u001b[1;33m\u001b[0m\u001b[1;33m\u001b[0m\u001b[0m\n\u001b[0;32m 4\u001b[0m \u001b[1;31m#from skimage.filters import threshold_adaptive\u001b[0m\u001b[1;33m\u001b[0m\u001b[1;33m\u001b[0m\u001b[1;33m\u001b[0m\u001b[0m\n\u001b[1;32m----> 5\u001b[1;33m \u001b[1;32mfrom\u001b[0m \u001b[0mskimage\u001b[0m \u001b[1;32mimport\u001b[0m \u001b[0mfilter\u001b[0m\u001b[1;33m\u001b[0m\u001b[1;33m\u001b[0m\u001b[0m\n\u001b[0m\u001b[0;32m 6\u001b[0m \u001b[1;32mimport\u001b[0m \u001b[0mos\u001b[0m\u001b[1;33m\u001b[0m\u001b[1;33m\u001b[0m\u001b[0m\n\u001b[0;32m 7\u001b[0m \u001b[1;33m\u001b[0m\u001b[0m\n",
|
| 738 |
-
"\u001b[1;31mImportError\u001b[0m: cannot import name 'filter' from 'skimage' (C:\\ProgramData\\Anaconda3\\envs\\cv\\lib\\site-packages\\skimage\\__init__.py)"
|
| 739 |
-
]
|
| 740 |
-
}
|
| 741 |
-
],
|
| 742 |
-
"source": [
|
| 743 |
-
"import cv2\n",
|
| 744 |
-
"import numpy as np\n",
|
| 745 |
-
"import imutils\n",
|
| 746 |
-
"#from skimage.filters import threshold_adaptive\n",
|
| 747 |
-
"from skimage import filter\n",
|
| 748 |
-
"import os\n",
|
| 749 |
-
"\n",
|
| 750 |
-
"def order_points(pts):\n",
|
| 751 |
-
" # initialzie a list of coordinates that will be ordered\n",
|
| 752 |
-
" # such that the first entry in the list is the top-left,\n",
|
| 753 |
-
" # the second entry is the top-right, the third is the\n",
|
| 754 |
-
" # bottom-right, and the fourth is the bottom-left\n",
|
| 755 |
-
" rect = np.zeros((4, 2), dtype = \"float32\")\n",
|
| 756 |
-
"\n",
|
| 757 |
-
" # the top-left point will have the smallest sum, whereas\n",
|
| 758 |
-
" # the bottom-right point will have the largest sum\n",
|
| 759 |
-
" s = pts.sum(axis = 1)\n",
|
| 760 |
-
" rect[0] = pts[np.argmin(s)]\n",
|
| 761 |
-
" rect[2] = pts[np.argmax(s)]\n",
|
| 762 |
-
"\n",
|
| 763 |
-
" # now, compute the difference between the points, the\n",
|
| 764 |
-
" # top-right point will have the smallest difference,\n",
|
| 765 |
-
" # whereas the bottom-left will have the largest difference\n",
|
| 766 |
-
" diff = np.diff(pts, axis = 1)\n",
|
| 767 |
-
" rect[1] = pts[np.argmin(diff)]\n",
|
| 768 |
-
" rect[3] = pts[np.argmax(diff)]\n",
|
| 769 |
-
"\n",
|
| 770 |
-
" # return the ordered coordinates\n",
|
| 771 |
-
" return rect\n",
|
| 772 |
-
"\n",
|
| 773 |
-
"def four_point_transform(image, pts):\n",
|
| 774 |
-
" # obtain a consistent order of the points and unpack them\n",
|
| 775 |
-
" # individually\n",
|
| 776 |
-
" rect = order_points(pts)\n",
|
| 777 |
-
" (tl, tr, br, bl) = rect\n",
|
| 778 |
-
"\n",
|
| 779 |
-
" # compute the width of the new image, which will be the\n",
|
| 780 |
-
" # maximum distance between bottom-right and bottom-left\n",
|
| 781 |
-
" # x-coordiates or the top-right and top-left x-coordinates\n",
|
| 782 |
-
" widthA = np.sqrt(((br[0] - bl[0]) ** 2) + ((br[1] - bl[1]) ** 2))\n",
|
| 783 |
-
" widthB = np.sqrt(((tr[0] - tl[0]) ** 2) + ((tr[1] - tl[1]) ** 2))\n",
|
| 784 |
-
" maxWidth = max(int(widthA), int(widthB))\n",
|
| 785 |
-
"\n",
|
| 786 |
-
" # compute the height of the new image, which will be the\n",
|
| 787 |
-
" # maximum distance between the top-right and bottom-right\n",
|
| 788 |
-
" # y-coordinates or the top-left and bottom-left y-coordinates\n",
|
| 789 |
-
" heightA = np.sqrt(((tr[0] - br[0]) ** 2) + ((tr[1] - br[1]) ** 2))\n",
|
| 790 |
-
" heightB = np.sqrt(((tl[0] - bl[0]) ** 2) + ((tl[1] - bl[1]) ** 2))\n",
|
| 791 |
-
" maxHeight = max(int(heightA), int(heightB))\n",
|
| 792 |
-
"\n",
|
| 793 |
-
" # now that we have the dimensions of the new image, construct\n",
|
| 794 |
-
" # the set of destination points to obtain a \"birds eye view\",\n",
|
| 795 |
-
" # (i.e. top-down view) of the image, again specifying points\n",
|
| 796 |
-
" # in the top-left, top-right, bottom-right, and bottom-left\n",
|
| 797 |
-
" # order\n",
|
| 798 |
-
" dst = np.array([\n",
|
| 799 |
-
" [0, 0],\n",
|
| 800 |
-
" [maxWidth - 1, 0],\n",
|
| 801 |
-
" [maxWidth - 1, maxHeight - 1],\n",
|
| 802 |
-
" [0, maxHeight - 1]], dtype = \"float32\")\n",
|
| 803 |
-
"\n",
|
| 804 |
-
" # compute the perspective transform matrix and then apply it\n",
|
| 805 |
-
" M = cv2.getPerspectiveTransform(rect, dst)\n",
|
| 806 |
-
" warped = cv2.warpPerspective(image, M, (maxWidth, maxHeight))\n",
|
| 807 |
-
"\n",
|
| 808 |
-
" # return the warped image\n",
|
| 809 |
-
" return warped\n",
|
| 810 |
-
"\n",
|
| 811 |
-
"def doc_Scan(image):\n",
|
| 812 |
-
" orig_height, orig_width = image.shape[:2]\n",
|
| 813 |
-
" ratio = image.shape[0] / 500.0\n",
|
| 814 |
-
"\n",
|
| 815 |
-
" orig = image.copy()\n",
|
| 816 |
-
" image = imutils.resize(image, height = 500)\n",
|
| 817 |
-
" orig_height, orig_width = image.shape[:2]\n",
|
| 818 |
-
" Original_Area = orig_height * orig_width\n",
|
| 819 |
-
" \n",
|
| 820 |
-
" # convert the image to grayscale, blur it, and find edges\n",
|
| 821 |
-
" # in the image\n",
|
| 822 |
-
" gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)\n",
|
| 823 |
-
" gray = cv2.GaussianBlur(gray, (5, 5), 0)\n",
|
| 824 |
-
" edged = cv2.Canny(gray, 75, 200)\n",
|
| 825 |
-
"\n",
|
| 826 |
-
" cv2.imshow(\"Image\", image)\n",
|
| 827 |
-
" cv2.imshow(\"Edged\", edged)\n",
|
| 828 |
-
" cv2.waitKey(0)\n",
|
| 829 |
-
" # show the original image and the edge detected image\n",
|
| 830 |
-
"\n",
|
| 831 |
-
" # find the contours in the edged image, keeping only the\n",
|
| 832 |
-
" # largest ones, and initialize the screen contour\n",
|
| 833 |
-
" _, contours, hierarchy = cv2.findContours(edged.copy(), cv2.RETR_LIST, cv2.CHAIN_APPROX_SIMPLE)\n",
|
| 834 |
-
" contours = sorted(contours, key = cv2.contourArea, reverse = True)[:5]\n",
|
| 835 |
-
" \n",
|
| 836 |
-
" # loop over the contours\n",
|
| 837 |
-
" for c in contours:\n",
|
| 838 |
-
"\n",
|
| 839 |
-
" # approximate the contour\n",
|
| 840 |
-
" area = cv2.contourArea(c)\n",
|
| 841 |
-
" if area < (Original_Area/3):\n",
|
| 842 |
-
" print(\"Error Image Invalid\")\n",
|
| 843 |
-
" return(\"ERROR\")\n",
|
| 844 |
-
" peri = cv2.arcLength(c, True)\n",
|
| 845 |
-
" approx = cv2.approxPolyDP(c, 0.02 * peri, True)\n",
|
| 846 |
-
"\n",
|
| 847 |
-
" # if our approximated contour has four points, then we\n",
|
| 848 |
-
" # can assume that we have found our screen\n",
|
| 849 |
-
" if len(approx) == 4:\n",
|
| 850 |
-
" screenCnt = approx\n",
|
| 851 |
-
" break\n",
|
| 852 |
-
"\n",
|
| 853 |
-
" # show the contour (outline) of the piece of paper\n",
|
| 854 |
-
" cv2.drawContours(image, [screenCnt], -1, (0, 255, 0), 2)\n",
|
| 855 |
-
" cv2.imshow(\"Outline\", image)\n",
|
| 856 |
-
"\n",
|
| 857 |
-
" warped = four_point_transform(orig, screenCnt.reshape(4, 2) * ratio)\n",
|
| 858 |
-
" # convert the warped image to grayscale, then threshold it\n",
|
| 859 |
-
" # to give it that 'black and white' paper effect\n",
|
| 860 |
-
" cv2.resize(warped, (640,403), interpolation = cv2.INTER_AREA)\n",
|
| 861 |
-
" cv2.imwrite(\"credit_card_color.jpg\", warped)\n",
|
| 862 |
-
" warped = cv2.cvtColor(warped, cv2.COLOR_BGR2GRAY)\n",
|
| 863 |
-
" warped = warped.astype(\"uint8\") * 255\n",
|
| 864 |
-
" cv2.imshow(\"Extracted Credit Card\", warped)\n",
|
| 865 |
-
" cv2.waitKey(0)\n",
|
| 866 |
-
" cv2.destroyAllWindows()\n",
|
| 867 |
-
" return warped"
|
| 868 |
-
]
|
| 869 |
-
},
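The `ImportError` above is expected on recent scikit-image releases: the `filter` module was renamed to `filters` in scikit-image 0.11. The import is unused in this cell, so it can simply be removed, or updated if adaptive thresholding is reinstated:

# 'skimage.filter' no longer exists; threshold_local replaced the old threshold_adaptive
from skimage.filters import threshold_local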
|
| 870 |
-
{
|
| 871 |
-
"cell_type": "code",
|
| 872 |
-
"execution_count": 30,
|
| 873 |
-
"metadata": {},
|
| 874 |
-
"outputs": [],
|
| 875 |
-
"source": [
|
| 876 |
-
"cv2.destroyAllWindows()"
|
| 877 |
-
]
|
| 878 |
-
},
|
| 879 |
-
{
|
| 880 |
-
"cell_type": "markdown",
|
| 881 |
-
"metadata": {},
|
| 882 |
-
"source": [
|
| 883 |
-
"## Extract our Credit Card and the Region of Interest (ROI)"
|
| 884 |
-
]
|
| 885 |
-
},
|
| 886 |
-
{
|
| 887 |
-
"cell_type": "code",
|
| 888 |
-
"execution_count": 2,
|
| 889 |
-
"metadata": {},
|
| 890 |
-
"outputs": [],
|
| 891 |
-
"source": [
|
| 892 |
-
"image = cv2.imread('test_card.jpg')\n",
|
| 893 |
-
"image = doc_Scan(image)\n",
|
| 894 |
-
"\n",
|
| 895 |
-
"region = [(55, 210), (640, 290)]\n",
|
| 896 |
-
"\n",
|
| 897 |
-
"top_left_y = region[0][1]\n",
|
| 898 |
-
"bottom_right_y = region[1][1]\n",
|
| 899 |
-
"top_left_x = region[0][0]\n",
|
| 900 |
-
"bottom_right_x = region[1][0]\n",
|
| 901 |
-
"\n",
|
| 902 |
-
"# Extracting the area were the credit numbers are located\n",
|
| 903 |
-
"roi = image[top_left_y:bottom_right_y, top_left_x:bottom_right_x]\n",
|
| 904 |
-
"cv2.imshow(\"Region\", roi)\n",
|
| 905 |
-
"cv2.imwrite(\"credit_card_extracted_digits.jpg\", roi)\n",
|
| 906 |
-
"cv2.waitKey(0)\n",
|
| 907 |
-
"cv2.destroyAllWindows()"
|
| 908 |
-
]
|
| 909 |
-
},
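The `region` coordinates above are hardcoded for a 640x403 card canvas, which is why the resize inside `doc_Scan` must actually be assigned (see the comment in that function). A quick guard makes the assumption explicit (a sketch):

h, w = image.shape[:2]
assert (w, h) == (640, 403), 'region coordinates assume a 640x403 extracted card'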
|
| 910 |
-
{
|
| 911 |
-
"cell_type": "markdown",
|
| 912 |
-
"metadata": {},
|
| 913 |
-
"source": [
|
| 914 |
-
"## Loading our trained model"
|
| 915 |
-
]
|
| 916 |
-
},
|
| 917 |
-
{
|
| 918 |
-
"cell_type": "code",
|
| 919 |
-
"execution_count": 3,
|
| 920 |
-
"metadata": {},
|
| 921 |
-
"outputs": [
|
| 922 |
-
{
|
| 923 |
-
"name": "stderr",
|
| 924 |
-
"output_type": "stream",
|
| 925 |
-
"text": [
|
| 926 |
-
"Using TensorFlow backend.\n"
|
| 927 |
-
]
|
| 928 |
-
}
|
| 929 |
-
],
|
| 930 |
-
"source": [
|
| 931 |
-
"from tensorflow.keras.models import load_model\n",
|
| 932 |
-
"import keras\n",
|
| 933 |
-
"\n",
|
| 934 |
-
"classifier = load_model('creditcard.h5')"
|
| 935 |
-
]
|
| 936 |
-
},
|
| 937 |
-
{
|
| 938 |
-
"cell_type": "markdown",
|
| 939 |
-
"metadata": {},
|
| 940 |
-
"source": [
|
| 941 |
-
"# Let's test on our extracted image"
|
| 942 |
-
]
|
| 943 |
-
},
|
| 944 |
-
{
|
| 945 |
-
"cell_type": "code",
|
| 946 |
-
"execution_count": null,
|
| 947 |
-
"metadata": {},
|
| 948 |
-
"outputs": [
|
| 949 |
-
{
|
| 950 |
-
"name": "stdout",
|
| 951 |
-
"output_type": "stream",
|
| 952 |
-
"text": [
|
| 953 |
-
"5\n",
|
| 954 |
-
"3\n",
|
| 955 |
-
"5\n",
|
| 956 |
-
"5\n",
|
| 957 |
-
"2\n",
|
| 958 |
-
"2\n",
|
| 959 |
-
"0\n",
|
| 960 |
-
"3\n",
|
| 961 |
-
"2\n",
|
| 962 |
-
"3\n",
|
| 963 |
-
"9\n",
|
| 964 |
-
"0\n"
|
| 965 |
-
]
|
| 966 |
-
}
|
| 967 |
-
],
|
| 968 |
-
"source": [
|
| 969 |
-
"def x_cord_contour(contours):\n",
|
| 970 |
-
" #Returns the X cordinate for the contour centroid\n",
|
| 971 |
-
" if cv2.contourArea(contours) > 10:\n",
|
| 972 |
-
" M = cv2.moments(contours)\n",
|
| 973 |
-
" return (int(M['m10']/M['m00']))\n",
|
| 974 |
-
" else:\n",
|
| 975 |
-
" pass\n",
|
| 976 |
-
"\n",
|
| 977 |
-
"img = cv2.imread('credit_card_extracted_digits.jpg')\n",
|
| 978 |
-
"orig_img = cv2.imread('credit_card_color.jpg')\n",
|
| 979 |
-
"gray = cv2.cvtColor(img,cv2.COLOR_BGR2GRAY)\n",
|
| 980 |
-
"cv2.imshow(\"image\", img)\n",
|
| 981 |
-
"cv2.waitKey(0)\n",
|
| 982 |
-
"\n",
|
| 983 |
-
"# Blur image then find edges using Canny \n",
|
| 984 |
-
"blurred = cv2.GaussianBlur(gray, (5, 5), 0)\n",
|
| 985 |
-
"#cv2.imshow(\"blurred\", blurred)\n",
|
| 986 |
-
"#cv2.waitKey(0)\n",
|
| 987 |
-
"\n",
|
| 988 |
-
"edged = cv2.Canny(blurred, 30, 150)\n",
|
| 989 |
-
"#cv2.imshow(\"edged\", edged)\n",
|
| 990 |
-
"#cv2.waitKey(0)\n",
|
| 991 |
-
"\n",
|
| 992 |
-
"# Find Contours\n",
|
| 993 |
-
"_, contours, _ = cv2.findContours(edged.copy(), cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)\n",
|
| 994 |
-
"\n",
|
| 995 |
-
"#Sort out contours left to right by using their x cordinates\n",
|
| 996 |
-
"contours = sorted(contours, key=cv2.contourArea, reverse=True)[:13] #Change this to 16 to get all digits\n",
|
| 997 |
-
"contours = sorted(contours, key = x_cord_contour, reverse = False)\n",
|
| 998 |
-
"\n",
|
| 999 |
-
"# Create empty array to store entire number\n",
|
| 1000 |
-
"full_number = []\n",
|
| 1001 |
-
"\n",
|
| 1002 |
-
"# loop over the contours\n",
|
| 1003 |
-
"for c in contours:\n",
|
| 1004 |
-
" # compute the bounding box for the rectangle\n",
|
| 1005 |
-
" (x, y, w, h) = cv2.boundingRect(c) \n",
|
| 1006 |
-
" if w >= 5 and h >= 25 and cv2.contourArea(c) < 1000:\n",
|
| 1007 |
-
" roi = blurred[y:y + h, x:x + w]\n",
|
| 1008 |
-
" #ret, roi = cv2.threshold(roi, 20, 255,cv2.THRESH_BINARY_INV)\n",
|
| 1009 |
-
" cv2.imshow(\"ROI1\", roi)\n",
|
| 1010 |
-
" roi_otsu = pre_process(roi, True)\n",
|
| 1011 |
-
" cv2.imshow(\"ROI2\", roi_otsu)\n",
|
| 1012 |
-
" roi_otsu = cv2.cvtColor(roi_otsu, cv2.COLOR_GRAY2RGB)\n",
|
| 1013 |
-
" roi_otsu = keras.preprocessing.image.img_to_array(roi_otsu)\n",
|
| 1014 |
-
" roi_otsu = roi_otsu * 1./255\n",
|
| 1015 |
-
" roi_otsu = np.expand_dims(roi_otsu, axis=0)\n",
|
| 1016 |
-
" image = np.vstack([roi_otsu])\n",
|
| 1017 |
-
" label = str(classifier.predict_classes(image, batch_size = 10))[1]\n",
|
| 1018 |
-
" print(label)\n",
|
| 1019 |
-
" (x, y, w, h) = (x+region[0][0], y+region[0][1], w, h)\n",
|
| 1020 |
-
" cv2.rectangle(orig_img, (x, y), (x + w, y + h), (0, 255, 0), 2)\n",
|
| 1021 |
-
" cv2.putText(orig_img, label, (x , y + 90), cv2.FONT_HERSHEY_COMPLEX, 2, (0, 255, 0), 2)\n",
|
| 1022 |
-
" cv2.imshow(\"image\", orig_img)\n",
|
| 1023 |
-
" cv2.waitKey(0) \n",
|
| 1024 |
-
" \n",
|
| 1025 |
-
"cv2.destroyAllWindows()"
|
| 1026 |
-
]
|
| 1027 |
-
},
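One fragility worth noting: `x_cord_contour` returns `None` for contours with an area of 10 or less, and Python 3's `sorted` cannot compare `None` values; the same pattern raises the `TypeError` visible in the MNIST checkpoint notebook further down in this commit. A safer sketch returns a number for every contour:

def x_cord_contour(contour):
    # Returns the x-coordinate of the contour centroid; degenerate contours sort first
    M = cv2.moments(contour)
    if M['m00'] == 0:  # zero-area contour, centroid undefined
        return 0
    return int(M['m10'] / M['m00'])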
|
| 1028 |
-
{
|
| 1029 |
-
"cell_type": "code",
|
| 1030 |
-
"execution_count": 11,
|
| 1031 |
-
"metadata": {},
|
| 1032 |
-
"outputs": [],
|
| 1033 |
-
"source": [
|
| 1034 |
-
"cv2.destroyAllWindows()"
|
| 1035 |
-
]
|
| 1036 |
-
},
|
| 1037 |
-
{
|
| 1038 |
-
"cell_type": "code",
|
| 1039 |
-
"execution_count": null,
|
| 1040 |
-
"metadata": {},
|
| 1041 |
-
"outputs": [],
|
| 1042 |
-
"source": []
|
| 1043 |
-
}
|
| 1044 |
-
],
|
| 1045 |
-
"metadata": {
|
| 1046 |
-
"kernelspec": {
|
| 1047 |
-
"display_name": "Python 3",
|
| 1048 |
-
"language": "python",
|
| 1049 |
-
"name": "python3"
|
| 1050 |
-
},
|
| 1051 |
-
"language_info": {
|
| 1052 |
-
"codemirror_mode": {
|
| 1053 |
-
"name": "ipython",
|
| 1054 |
-
"version": 3
|
| 1055 |
-
},
|
| 1056 |
-
"file_extension": ".py",
|
| 1057 |
-
"mimetype": "text/x-python",
|
| 1058 |
-
"name": "python",
|
| 1059 |
-
"nbconvert_exporter": "python",
|
| 1060 |
-
"pygments_lexer": "ipython3",
|
| 1061 |
-
"version": "3.7.4"
|
| 1062 |
-
}
|
| 1063 |
-
},
|
| 1064 |
-
"nbformat": 4,
|
| 1065 |
-
"nbformat_minor": 2
|
| 1066 |
-
}
|
|
|
|
| 1 |
+
version https://git-lfs.github.com/spec/v1
|
| 2 |
+
oid sha256:eece39cacf3b346d8b9ed53f038faa455b81f938093f52f89a68a592e5949f95
|
| 3 |
+
size 40204
|
|
|
|
4. Get Started! Handwritting Recognition, Simple Object Classification & OpenCV Demo/.ipynb_checkpoints/4.1 - Handwritten Digit Classification Demo (MNIST)-checkpoint.ipynb
CHANGED
|
@@ -1,327 +1,3 @@
|
|
| 1 |
-
|
| 2 |
-
|
| 3 |
-
|
| 4 |
-
"cell_type": "markdown",
|
| 5 |
-
"metadata": {},
|
| 6 |
-
"source": [
|
| 7 |
-
"### Let's load a Handwritten Digit classifier we'll be building very soon!"
|
| 8 |
-
]
|
| 9 |
-
},
|
| 10 |
-
{
|
| 11 |
-
"cell_type": "code",
|
| 12 |
-
"execution_count": 3,
|
| 13 |
-
"metadata": {},
|
| 14 |
-
"outputs": [],
|
| 15 |
-
"source": [
|
| 16 |
-
"import cv2\n",
|
| 17 |
-
"import numpy as np\n",
|
| 18 |
-
"from keras.datasets import mnist\n",
|
| 19 |
-
"from keras.models import load_model\n",
|
| 20 |
-
"\n",
|
| 21 |
-
"classifier = load_model('/home/deeplearningcv/DeepLearningCV/Trained Models/mnist_simple_cnn.h5')\n",
|
| 22 |
-
"\n",
|
| 23 |
-
"# loads the MNIST dataset\n",
|
| 24 |
-
"(x_train, y_train), (x_test, y_test) = mnist.load_data()\n",
|
| 25 |
-
"\n",
|
| 26 |
-
"def draw_test(name, pred, input_im):\n",
|
| 27 |
-
" BLACK = [0,0,0]\n",
|
| 28 |
-
" expanded_image = cv2.copyMakeBorder(input_im, 0, 0, 0, imageL.shape[0] ,cv2.BORDER_CONSTANT,value=BLACK)\n",
|
| 29 |
-
" expanded_image = cv2.cvtColor(expanded_image, cv2.COLOR_GRAY2BGR)\n",
|
| 30 |
-
" cv2.putText(expanded_image, str(pred), (152, 70) , cv2.FONT_HERSHEY_COMPLEX_SMALL,4, (0,255,0), 2)\n",
|
| 31 |
-
" cv2.imshow(name, expanded_image)\n",
|
| 32 |
-
"\n",
|
| 33 |
-
"for i in range(0,10):\n",
|
| 34 |
-
" rand = np.random.randint(0,len(x_test))\n",
|
| 35 |
-
" input_im = x_test[rand]\n",
|
| 36 |
-
"\n",
|
| 37 |
-
" imageL = cv2.resize(input_im, None, fx=4, fy=4, interpolation = cv2.INTER_CUBIC) \n",
|
| 38 |
-
" input_im = input_im.reshape(1,28,28,1) \n",
|
| 39 |
-
" \n",
|
| 40 |
-
" ## Get Prediction\n",
|
| 41 |
-
" res = str(classifier.predict_classes(input_im, 1, verbose = 0)[0])\n",
|
| 42 |
-
" draw_test(\"Prediction\", res, imageL) \n",
|
| 43 |
-
" cv2.waitKey(0)\n",
|
| 44 |
-
"\n",
|
| 45 |
-
"cv2.destroyAllWindows()"
|
| 46 |
-
]
|
| 47 |
-
},
|
| 48 |
-
{
|
| 49 |
-
"cell_type": "code",
|
| 50 |
-
"execution_count": null,
|
| 51 |
-
"metadata": {},
|
| 52 |
-
"outputs": [],
|
| 53 |
-
"source": []
|
| 54 |
-
},
|
| 55 |
-
{
|
| 56 |
-
"cell_type": "code",
|
| 57 |
-
"execution_count": 5,
|
| 58 |
-
"metadata": {},
|
| 59 |
-
"outputs": [
|
| 60 |
-
{
|
| 61 |
-
"ename": "TypeError",
|
| 62 |
-
"evalue": "'<' not supported between instances of 'NoneType' and 'NoneType'",
|
| 63 |
-
"output_type": "error",
|
| 64 |
-
"traceback": [
|
| 65 |
-
"\u001b[0;31m---------------------------------------------------------------------------\u001b[0m",
|
| 66 |
-
"\u001b[0;31mTypeError\u001b[0m Traceback (most recent call last)",
|
| 67 |
-
"\u001b[0;32m<ipython-input-5-ad16a07b4c0b>\u001b[0m in \u001b[0;36m<module>\u001b[0;34m\u001b[0m\n\u001b[1;32m 22\u001b[0m \u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 23\u001b[0m \u001b[0;31m#Sort out contours left to right by using their x cordinates\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0;32m---> 24\u001b[0;31m \u001b[0mcontours\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0msorted\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mcontours\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mkey\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0mx_cord_contour\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mreverse\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0;32mFalse\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0m\u001b[1;32m 25\u001b[0m \u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 26\u001b[0m \u001b[0;31m# Create empty array to store entire number\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n",
|
| 68 |
-
"\u001b[0;31mTypeError\u001b[0m: '<' not supported between instances of 'NoneType' and 'NoneType'"
|
| 69 |
-
]
|
| 70 |
-
}
|
| 71 |
-
],
|
| 72 |
-
"source": [
|
| 73 |
-
"import numpy as np\n",
|
| 74 |
-
"import cv2\n",
|
| 75 |
-
"from preprocessors import x_cord_contour, makeSquare, resize_to_pixel\n",
|
| 76 |
-
"\n",
|
| 77 |
-
"image = cv2.imread('images/numbers.jpg')\n",
|
| 78 |
-
"gray = cv2.cvtColor(image,cv2.COLOR_BGR2GRAY)\n",
|
| 79 |
-
"cv2.imshow(\"image\", image)\n",
|
| 80 |
-
"cv2.imshow(\"gray\", gray)\n",
|
| 81 |
-
"cv2.waitKey(0)\n",
|
| 82 |
-
"\n",
|
| 83 |
-
"# Blur image then find edges using Canny \n",
|
| 84 |
-
"blurred = cv2.GaussianBlur(gray, (5, 5), 0)\n",
|
| 85 |
-
"cv2.imshow(\"blurred\", blurred)\n",
|
| 86 |
-
"cv2.waitKey(0)\n",
|
| 87 |
-
"\n",
|
| 88 |
-
"edged = cv2.Canny(blurred, 30, 150)\n",
|
| 89 |
-
"cv2.imshow(\"edged\", edged)\n",
|
| 90 |
-
"cv2.waitKey(0)\n",
|
| 91 |
-
"\n",
|
| 92 |
-
"# Fint Contours\n",
|
| 93 |
-
"_, contours, _ = cv2.findContours(edged.copy(), cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)\n",
|
| 94 |
-
"\n",
|
| 95 |
-
"#Sort out contours left to right by using their x cordinates\n",
|
| 96 |
-
"contours = sorted(contours, key = x_cord_contour, reverse = False)\n",
|
| 97 |
-
"\n",
|
| 98 |
-
"# Create empty array to store entire number\n",
|
| 99 |
-
"full_number = []\n",
|
| 100 |
-
"\n",
|
| 101 |
-
"# loop over the contours\n",
|
| 102 |
-
"for c in contours:\n",
|
| 103 |
-
" # compute the bounding box for the rectangle\n",
|
| 104 |
-
" (x, y, w, h) = cv2.boundingRect(c) \n",
|
| 105 |
-
" \n",
|
| 106 |
-
" #cv2.drawContours(image, contours, -1, (0,255,0), 3)\n",
|
| 107 |
-
" #cv2.imshow(\"Contours\", image)\n",
|
| 108 |
-
"\n",
|
| 109 |
-
" if w >= 5 and h >= 25:\n",
|
| 110 |
-
" roi = blurred[y:y + h, x:x + w]\n",
|
| 111 |
-
" ret, roi = cv2.threshold(roi, 127, 255,cv2.THRESH_BINARY_INV)\n",
|
| 112 |
-
" squared = makeSquare(roi)\n",
|
| 113 |
-
" final = resize_to_pixel(20, squared)\n",
|
| 114 |
-
" cv2.imshow(\"final\", final)\n",
|
| 115 |
-
" final_array = final.reshape((1,400))\n",
|
| 116 |
-
" final_array = final_array.astype(np.float32)\n",
|
| 117 |
-
" #ret, result, neighbours, dist = knn.find_nearest(final_array, k=1)\n",
|
| 118 |
-
" #number = str(int(float(result[0])))\n",
|
| 119 |
-
" #full_number.append(number)\n",
|
| 120 |
-
" # draw a rectangle around the digit, the show what the\n",
|
| 121 |
-
" # digit was classified as\n",
|
| 122 |
-
" #cv2.rectangle(image, (x, y), (x + w, y + h), (0, 0, 255), 2)\n",
|
| 123 |
-
" #cv2.putText(image, number, (x , y + 155),\n",
|
| 124 |
-
" # cv2.FONT_HERSHEY_COMPLEX, 2, (255, 0, 0), 2)\n",
|
| 125 |
-
" cv2.imshow(\"image\", image)\n",
|
| 126 |
-
" cv2.waitKey(0) \n",
|
| 127 |
-
" \n",
|
| 128 |
-
"cv2.destroyAllWindows()\n",
|
| 129 |
-
"print (\"The number is: \" + ''.join(full_number))"
|
| 130 |
-
]
|
| 131 |
-
},
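The `TypeError` above comes from `x_cord_contour` (imported from `preprocessors`) returning `None` for small contours, which Python 3's `sorted` cannot compare. Filtering degenerate contours first avoids it (a sketch, assuming the same area cutoff the key function uses internally):

contours = [c for c in contours if cv2.contourArea(c) > 10]
contours = sorted(contours, key=x_cord_contour, reverse=False)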
|
| 132 |
-
{
|
| 133 |
-
"cell_type": "code",
|
| 134 |
-
"execution_count": 6,
|
| 135 |
-
"metadata": {},
|
| 136 |
-
"outputs": [],
|
| 137 |
-
"source": [
|
| 138 |
-
"cv2.destroyAllWindows()"
|
| 139 |
-
]
|
| 140 |
-
},
|
| 141 |
-
{
|
| 142 |
-
"cell_type": "code",
|
| 143 |
-
"execution_count": null,
|
| 144 |
-
"metadata": {},
|
| 145 |
-
"outputs": [],
|
| 146 |
-
"source": []
|
| 147 |
-
},
|
| 148 |
-
{
|
| 149 |
-
"cell_type": "code",
|
| 150 |
-
"execution_count": 1,
|
| 151 |
-
"metadata": {},
|
| 152 |
-
"outputs": [
|
| 153 |
-
{
|
| 154 |
-
"name": "stderr",
|
| 155 |
-
"output_type": "stream",
|
| 156 |
-
"text": [
|
| 157 |
-
"Using TensorFlow backend.\n"
|
| 158 |
-
]
|
| 159 |
-
},
|
| 160 |
-
{
|
| 161 |
-
"name": "stdout",
|
| 162 |
-
"output_type": "stream",
|
| 163 |
-
"text": [
|
| 164 |
-
"x_train shape: (60000, 28, 28, 1)\n",
|
| 165 |
-
"60000 train samples\n",
|
| 166 |
-
"10000 test samples\n",
|
| 167 |
-
"Number of Classes: 10\n",
|
| 168 |
-
"_________________________________________________________________\n",
|
| 169 |
-
"Layer (type) Output Shape Param # \n",
|
| 170 |
-
"=================================================================\n",
|
| 171 |
-
"conv2d_1 (Conv2D) (None, 26, 26, 32) 320 \n",
|
| 172 |
-
"_________________________________________________________________\n",
|
| 173 |
-
"conv2d_2 (Conv2D) (None, 24, 24, 64) 18496 \n",
|
| 174 |
-
"_________________________________________________________________\n",
|
| 175 |
-
"max_pooling2d_1 (MaxPooling2 (None, 12, 12, 64) 0 \n",
|
| 176 |
-
"_________________________________________________________________\n",
|
| 177 |
-
"dropout_1 (Dropout) (None, 12, 12, 64) 0 \n",
|
| 178 |
-
"_________________________________________________________________\n",
|
| 179 |
-
"flatten_1 (Flatten) (None, 9216) 0 \n",
|
| 180 |
-
"_________________________________________________________________\n",
|
| 181 |
-
"dense_1 (Dense) (None, 128) 1179776 \n",
|
| 182 |
-
"_________________________________________________________________\n",
|
| 183 |
-
"dropout_2 (Dropout) (None, 128) 0 \n",
|
| 184 |
-
"_________________________________________________________________\n",
|
| 185 |
-
"dense_2 (Dense) (None, 10) 1290 \n",
|
| 186 |
-
"=================================================================\n",
|
| 187 |
-
"Total params: 1,199,882\n",
|
| 188 |
-
"Trainable params: 1,199,882\n",
|
| 189 |
-
"Non-trainable params: 0\n",
|
| 190 |
-
"_________________________________________________________________\n",
|
| 191 |
-
"None\n",
|
| 192 |
-
"Train on 60000 samples, validate on 10000 samples\n",
|
| 193 |
-
"Epoch 1/5\n",
|
| 194 |
-
"60000/60000 [==============================] - 189s 3ms/step - loss: 0.2667 - acc: 0.9187 - val_loss: 0.0556 - val_acc: 0.9819\n",
|
| 195 |
-
"Epoch 2/5\n",
|
| 196 |
-
"60000/60000 [==============================] - 170s 3ms/step - loss: 0.0889 - acc: 0.9740 - val_loss: 0.0396 - val_acc: 0.9864\n",
|
| 197 |
-
"Epoch 3/5\n",
|
| 198 |
-
"60000/60000 [==============================] - 192s 3ms/step - loss: 0.0670 - acc: 0.9805 - val_loss: 0.0350 - val_acc: 0.9884\n",
|
| 199 |
-
"Epoch 4/5\n",
|
| 200 |
-
"60000/60000 [==============================] - 194s 3ms/step - loss: 0.0553 - acc: 0.9838 - val_loss: 0.0321 - val_acc: 0.9897\n",
|
| 201 |
-
"Epoch 5/5\n",
|
| 202 |
-
"60000/60000 [==============================] - 204s 3ms/step - loss: 0.0464 - acc: 0.9856 - val_loss: 0.0282 - val_acc: 0.9907\n",
|
| 203 |
-
"Test loss: 0.028205487172584982\n",
|
| 204 |
-
"Test accuracy: 0.9907\n"
|
| 205 |
-
]
|
| 206 |
-
}
|
| 207 |
-
],
|
| 208 |
-
"source": [
|
| 209 |
-
"from keras.datasets import mnist\n",
|
| 210 |
-
"from keras.utils import np_utils\n",
|
| 211 |
-
"import keras\n",
|
| 212 |
-
"from keras.datasets import mnist\n",
|
| 213 |
-
"from keras.models import Sequential\n",
|
| 214 |
-
"from keras.layers import Dense, Dropout, Flatten\n",
|
| 215 |
-
"from keras.layers import Conv2D, MaxPooling2D\n",
|
| 216 |
-
"from keras import backend as K\n",
|
| 217 |
-
"\n",
|
| 218 |
-
"# Training Parameters\n",
|
| 219 |
-
"batch_size = 128\n",
|
| 220 |
-
"epochs = 5\n",
|
| 221 |
-
"\n",
|
| 222 |
-
"# loads the MNIST dataset\n",
|
| 223 |
-
"(x_train, y_train), (x_test, y_test) = mnist.load_data()\n",
|
| 224 |
-
"\n",
|
| 225 |
-
"# Lets store the number of rows and columns\n",
|
| 226 |
-
"img_rows = x_train[0].shape[0]\n",
|
| 227 |
-
"img_cols = x_train[1].shape[0]\n",
|
| 228 |
-
"\n",
|
| 229 |
-
"# Getting our date in the right 'shape' needed for Keras\n",
|
| 230 |
-
"# We need to add a 4th dimenion to our date thereby changing our\n",
|
| 231 |
-
"# Our original image shape of (60000,28,28) to (60000,28,28,1)\n",
|
| 232 |
-
"x_train = x_train.reshape(x_train.shape[0], img_rows, img_cols, 1)\n",
|
| 233 |
-
"x_test = x_test.reshape(x_test.shape[0], img_rows, img_cols, 1)\n",
|
| 234 |
-
"\n",
|
| 235 |
-
"# store the shape of a single image \n",
|
| 236 |
-
"input_shape = (img_rows, img_cols, 1)\n",
|
| 237 |
-
"\n",
|
| 238 |
-
"# change our image type to float32 data type\n",
|
| 239 |
-
"x_train = x_train.astype('float32')\n",
|
| 240 |
-
"x_test = x_test.astype('float32')\n",
|
| 241 |
-
"\n",
|
| 242 |
-
"# Normalize our data by changing the range from (0 to 255) to (0 to 1)\n",
|
| 243 |
-
"x_train /= 255\n",
|
| 244 |
-
"x_test /= 255\n",
|
| 245 |
-
"\n",
|
| 246 |
-
"print('x_train shape:', x_train.shape)\n",
|
| 247 |
-
"print(x_train.shape[0], 'train samples')\n",
|
| 248 |
-
"print(x_test.shape[0], 'test samples')\n",
|
| 249 |
-
"\n",
|
| 250 |
-
"# Now we one hot encode outputs\n",
|
| 251 |
-
"y_train = np_utils.to_categorical(y_train)\n",
|
| 252 |
-
"y_test = np_utils.to_categorical(y_test)\n",
|
| 253 |
-
"\n",
|
| 254 |
-
"# Let's count the number columns in our hot encoded matrix \n",
|
| 255 |
-
"print (\"Number of Classes: \" + str(y_test.shape[1]))\n",
|
| 256 |
-
"\n",
|
| 257 |
-
"num_classes = y_test.shape[1]\n",
|
| 258 |
-
"num_pixels = x_train.shape[1] * x_train.shape[2]\n",
|
| 259 |
-
"\n",
|
| 260 |
-
"# create model\n",
|
| 261 |
-
"model = Sequential()\n",
|
| 262 |
-
"\n",
|
| 263 |
-
"model.add(Conv2D(32, kernel_size=(3, 3),\n",
|
| 264 |
-
" activation='relu',\n",
|
| 265 |
-
" input_shape=input_shape))\n",
|
| 266 |
-
"model.add(Conv2D(64, (3, 3), activation='relu'))\n",
|
| 267 |
-
"model.add(MaxPooling2D(pool_size=(2, 2)))\n",
|
| 268 |
-
"model.add(Dropout(0.25))\n",
|
| 269 |
-
"model.add(Flatten())\n",
|
| 270 |
-
"model.add(Dense(128, activation='relu'))\n",
|
| 271 |
-
"model.add(Dropout(0.5))\n",
|
| 272 |
-
"model.add(Dense(num_classes, activation='softmax'))\n",
|
| 273 |
-
"\n",
|
| 274 |
-
"model.compile(loss = 'categorical_crossentropy',\n",
|
| 275 |
-
" optimizer = keras.optimizers.Adadelta(),\n",
|
| 276 |
-
" metrics = ['accuracy'])\n",
|
| 277 |
-
"\n",
|
| 278 |
-
"print(model.summary())\n",
|
| 279 |
-
"\n",
|
| 280 |
-
"history = model.fit(x_train, y_train,\n",
|
| 281 |
-
" batch_size=batch_size,\n",
|
| 282 |
-
" epochs=epochs,\n",
|
| 283 |
-
" verbose=1,\n",
|
| 284 |
-
" validation_data=(x_test, y_test))\n",
|
| 285 |
-
"\n",
|
| 286 |
-
"score = model.evaluate(x_test, y_test, verbose=0)\n",
|
| 287 |
-
"print('Test loss:', score[0])\n",
|
| 288 |
-
"print('Test accuracy:', score[1])"
|
| 289 |
-
]
|
| 290 |
-
},
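The demo cell at the top of this notebook loads `mnist_simple_cnn.h5`; saving the model after training produces that file (the absolute path used by the demo is specific to that environment):

model.save('mnist_simple_cnn.h5')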
|
| 291 |
-
{
|
| 292 |
-
"cell_type": "code",
|
| 293 |
-
"execution_count": null,
|
| 294 |
-
"metadata": {},
|
| 295 |
-
"outputs": [],
|
| 296 |
-
"source": []
|
| 297 |
-
},
|
| 298 |
-
{
|
| 299 |
-
"cell_type": "code",
|
| 300 |
-
"execution_count": null,
|
| 301 |
-
"metadata": {},
|
| 302 |
-
"outputs": [],
|
| 303 |
-
"source": []
|
| 304 |
-
}
|
| 305 |
-
],
|
| 306 |
-
"metadata": {
|
| 307 |
-
"kernelspec": {
|
| 308 |
-
"display_name": "Python 3",
|
| 309 |
-
"language": "python",
|
| 310 |
-
"name": "python3"
|
| 311 |
-
},
|
| 312 |
-
"language_info": {
|
| 313 |
-
"codemirror_mode": {
|
| 314 |
-
"name": "ipython",
|
| 315 |
-
"version": 3
|
| 316 |
-
},
|
| 317 |
-
"file_extension": ".py",
|
| 318 |
-
"mimetype": "text/x-python",
|
| 319 |
-
"name": "python",
|
| 320 |
-
"nbconvert_exporter": "python",
|
| 321 |
-
"pygments_lexer": "ipython3",
|
| 322 |
-
"version": "3.6.6"
|
| 323 |
-
}
|
| 324 |
-
},
|
| 325 |
-
"nbformat": 4,
|
| 326 |
-
"nbformat_minor": 2
|
| 327 |
-
}
|
|
|
|
| 1 |
+
version https://git-lfs.github.com/spec/v1
|
| 2 |
+
oid sha256:3e98445db8b33fc0875182c4519f938679b0dd2d523bc831e2f49178cc1716c7
|
| 3 |
+
size 12870
|
|
|
|
|
4. Get Started! Handwritting Recognition, Simple Object Classification & OpenCV Demo/.ipynb_checkpoints/4.2 - Image Classifier - CIFAR10-checkpoint.ipynb
CHANGED
|
@@ -1,6 +1,3 @@
|
|
| 1 |
-
|
| 2 |
-
|
| 3 |
-
|
| 4 |
-
"nbformat": 4,
|
| 5 |
-
"nbformat_minor": 2
|
| 6 |
-
}
|
|
|
|
| 1 |
+
version https://git-lfs.github.com/spec/v1
|
| 2 |
+
oid sha256:188143f20ba64ead32853235735c589180056b6c7c47541744479767de696e37
|
| 3 |
+
size 72
|
|
|
|
|
|
|
|
|
4. Get Started! Handwritting Recognition, Simple Object Classification & OpenCV Demo/.ipynb_checkpoints/4.3. Live Sketching-checkpoint.ipynb
CHANGED
|
@@ -1,101 +1,3 @@
|
|
| 1 |
-
|
| 2 |
-
|
| 3 |
-
|
| 4 |
-
"cell_type": "code",
|
| 5 |
-
"execution_count": 1,
|
| 6 |
-
"metadata": {},
|
| 7 |
-
"outputs": [
|
| 8 |
-
{
|
| 9 |
-
"name": "stderr",
|
| 10 |
-
"output_type": "stream",
|
| 11 |
-
"text": [
|
| 12 |
-
"Using TensorFlow backend.\n"
|
| 13 |
-
]
|
| 14 |
-
},
|
| 15 |
-
{
|
| 16 |
-
"name": "stdout",
|
| 17 |
-
"output_type": "stream",
|
| 18 |
-
"text": [
|
| 19 |
-
"3.4.3\n"
|
| 20 |
-
]
|
| 21 |
-
}
|
| 22 |
-
],
|
| 23 |
-
"source": [
|
| 24 |
-
"import keras\n",
|
| 25 |
-
"import cv2\n",
|
| 26 |
-
"import numpy as np\n",
|
| 27 |
-
"import matplotlib\n",
|
| 28 |
-
"print (cv2.__version__)"
|
| 29 |
-
]
|
| 30 |
-
},
|
| 31 |
-
{
|
| 32 |
-
"cell_type": "code",
|
| 33 |
-
"execution_count": 3,
|
| 34 |
-
"metadata": {},
|
| 35 |
-
"outputs": [],
|
| 36 |
-
"source": [
|
| 37 |
-
"import cv2\n",
|
| 38 |
-
"import numpy as np\n",
|
| 39 |
-
"\n",
|
| 40 |
-
"# Our sketch generating function\n",
|
| 41 |
-
"def sketch(image):\n",
|
| 42 |
-
" # Convert image to grayscale\n",
|
| 43 |
-
" img_gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)\n",
|
| 44 |
-
" \n",
|
| 45 |
-
" # Clean up image using Guassian Blur\n",
|
| 46 |
-
" img_gray_blur = cv2.GaussianBlur(img_gray, (5,5), 0)\n",
|
| 47 |
-
" \n",
|
| 48 |
-
" # Extract edges\n",
|
| 49 |
-
" canny_edges = cv2.Canny(img_gray_blur, 10, 70)\n",
|
| 50 |
-
" \n",
|
| 51 |
-
" # Do an invert binarize the image \n",
|
| 52 |
-
" ret, mask = cv2.threshold(canny_edges, 70, 255, cv2.THRESH_BINARY_INV)\n",
|
| 53 |
-
" return mask\n",
|
| 54 |
-
"\n",
|
| 55 |
-
"\n",
|
| 56 |
-
"# Initialize webcam, cap is the object provided by VideoCapture\n",
|
| 57 |
-
"# It contains a boolean indicating if it was sucessful (ret)\n",
|
| 58 |
-
"# It also contains the images collected from the webcam (frame)\n",
|
| 59 |
-
"cap = cv2.VideoCapture(0)\n",
|
| 60 |
-
"\n",
|
| 61 |
-
"while True:\n",
|
| 62 |
-
" ret, frame = cap.read()\n",
|
| 63 |
-
" cv2.imshow('Our Live Sketcher', sketch(frame))\n",
|
| 64 |
-
" if cv2.waitKey(1) == 13: #13 is the Enter Key\n",
|
| 65 |
-
" break\n",
|
| 66 |
-
" \n",
|
| 67 |
-
"# Release camera and close windows\n",
|
| 68 |
-
"cap.release()\n",
|
| 69 |
-
"cv2.destroyAllWindows() "
|
| 70 |
-
]
|
| 71 |
-
},
|
| 72 |
-
{
|
| 73 |
-
"cell_type": "code",
|
| 74 |
-
"execution_count": null,
|
| 75 |
-
"metadata": {},
|
| 76 |
-
"outputs": [],
|
| 77 |
-
"source": []
|
| 78 |
-
}
|
| 79 |
-
],
|
| 80 |
-
"metadata": {
|
| 81 |
-
"kernelspec": {
|
| 82 |
-
"display_name": "Python 3",
|
| 83 |
-
"language": "python",
|
| 84 |
-
"name": "python3"
|
| 85 |
-
},
|
| 86 |
-
"language_info": {
|
| 87 |
-
"codemirror_mode": {
|
| 88 |
-
"name": "ipython",
|
| 89 |
-
"version": 3
|
| 90 |
-
},
|
| 91 |
-
"file_extension": ".py",
|
| 92 |
-
"mimetype": "text/x-python",
|
| 93 |
-
"name": "python",
|
| 94 |
-
"nbconvert_exporter": "python",
|
| 95 |
-
"pygments_lexer": "ipython3",
|
| 96 |
-
"version": "3.6.6"
|
| 97 |
-
}
|
| 98 |
-
},
|
| 99 |
-
"nbformat": 4,
|
| 100 |
-
"nbformat_minor": 2
|
| 101 |
-
}
|
|
|
|
| 1 |
+
version https://git-lfs.github.com/spec/v1
|
| 2 |
+
oid sha256:7713010cc1e5df2a16e913547808690eb87efac3b2926f632cc6c0ee304e408f
|
| 3 |
+
size 2383
|
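The notebook removed above is built around a single sketch() function: grayscale conversion, Gaussian blur, Canny edge extraction, then an inverted binary threshold. For readers without a webcam, the same pipeline can be run on a still image; the sketch below makes no assumptions beyond an installed OpenCV, and the file names input.jpg and sketch.jpg are placeholders, not from the notebook.

# Minimal still-image version of the sketch pipeline from the notebook above.
import cv2

def sketch(image):
    # Grayscale -> Gaussian blur -> Canny edges -> inverted binary mask
    gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)
    blurred = cv2.GaussianBlur(gray, (5, 5), 0)
    edges = cv2.Canny(blurred, 10, 70)
    _, mask = cv2.threshold(edges, 70, 255, cv2.THRESH_BINARY_INV)
    return mask

image = cv2.imread("input.jpg")  # placeholder path; any BGR image works
if image is None:
    raise FileNotFoundError("input.jpg could not be read")
cv2.imwrite("sketch.jpg", sketch(image))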
4. Get Started! Handwritting Recognition, Simple Object Classification & OpenCV Demo/.ipynb_checkpoints/Test - Imports Keras, OpenCV and tests webcam-checkpoint.ipynb
CHANGED
@@ -1,73 +1,3 @@
-
-
-
-"cell_type": "code",
-"execution_count": 4,
-"metadata": {},
-"outputs": [],
-"source": [
-"import cv2\n",
-"import numpy as np\n",
-"\n",
-"# Our sketch-generating function\n",
-"def sketch(image):\n",
-"    # Convert image to grayscale\n",
-"    img_gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)\n",
-"    \n",
-"    # Clean up image using Gaussian blur\n",
-"    img_gray_blur = cv2.GaussianBlur(img_gray, (5,5), 0)\n",
-"    \n",
-"    # Extract edges\n",
-"    canny_edges = cv2.Canny(img_gray_blur, 10, 70)\n",
-"    \n",
-"    # Invert and binarize the image\n",
-"    ret, mask = cv2.threshold(canny_edges, 70, 255, cv2.THRESH_BINARY_INV)\n",
-"    return mask\n",
-"\n",
-"\n",
-"# Initialize webcam; cap is the object provided by VideoCapture\n",
-"# It contains a boolean indicating if it was successful (ret)\n",
-"# It also contains the images collected from the webcam (frame)\n",
-"cap = cv2.VideoCapture(0)\n",
-"\n",
-"while True:\n",
-"    ret, frame = cap.read()\n",
-"    cv2.imshow('Our Live Sketcher', sketch(frame))\n",
-"    if cv2.waitKey(1) == 13: # 13 is the Enter key\n",
-"        break\n",
-"    \n",
-"# Release camera and close windows\n",
-"cap.release()\n",
-"cv2.destroyAllWindows()"
-]
-},
-{
-"cell_type": "code",
-"execution_count": null,
-"metadata": {},
-"outputs": [],
-"source": []
-}
-],
-"metadata": {
-"kernelspec": {
-"display_name": "Python 3",
-"language": "python",
-"name": "python3"
-},
-"language_info": {
-"codemirror_mode": {
-"name": "ipython",
-"version": 3
-},
-"file_extension": ".py",
-"mimetype": "text/x-python",
-"name": "python",
-"nbconvert_exporter": "python",
-"pygments_lexer": "ipython3",
-"version": "3.6.6"
-}
-},
-"nbformat": 4,
-"nbformat_minor": 2
-}
+version https://git-lfs.github.com/spec/v1
+oid sha256:49b0db58562700ccd3146af9fd7d72a4c2f10478596a2a99321db8d06ffe7bdf
+size 1910
|
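This checkpoint repeats the webcam loop from the 4.3 notebook. Two fragilities are worth noting: comparing cv2.waitKey(1) directly to 13 can fail on platforms where waitKey returns a value with bits set above the low byte, and cap.read() can hand back ret=False with a None frame, which would crash sketch(). A slightly hardened variant of the loop, reusing the sketch() function shown in the previous snippet (assumed in scope):

# Hardened capture loop; assumes a webcam at index 0 and sketch() defined above.
import cv2

cap = cv2.VideoCapture(0)
if not cap.isOpened():
    raise RuntimeError("Could not open webcam 0")

while True:
    ret, frame = cap.read()
    if not ret:  # dropped frame or disconnected camera
        break
    cv2.imshow('Our Live Sketcher', sketch(frame))
    if cv2.waitKey(1) & 0xFF == 13:  # mask to the low byte; 13 is Enter
        break

cap.release()
cv2.destroyAllWindows()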
4. Get Started! Handwritting Recognition, Simple Object Classification & OpenCV Demo/4.1 - Handwritten Digit Classification Demo (MNIST).ipynb
CHANGED
|
@@ -1,656 +1,3 @@
|
|
| 1 |
-
|
| 2 |
-
|
| 3 |
-
|
| 4 |
-
"cell_type": "markdown",
|
| 5 |
-
"metadata": {},
|
| 6 |
-
"source": [
|
| 7 |
-
"### Let's load a Handwritten Digit classifier we'll be building very soon!"
|
| 8 |
-
]
|
| 9 |
-
},
|
| 10 |
-
{
|
| 11 |
-
"cell_type": "code",
|
| 12 |
-
"metadata": {
|
| 13 |
-
"ExecuteTime": {
|
| 14 |
-
"end_time": "2025-03-13T16:02:55.674903Z",
|
| 15 |
-
"start_time": "2025-03-13T16:02:41.962641Z"
|
| 16 |
-
}
|
| 17 |
-
},
|
| 18 |
-
"source": [
|
| 19 |
-
"import cv2\n",
|
| 20 |
-
"import numpy as np\n",
|
| 21 |
-
"import tensorflow as tf\n",
|
| 22 |
-
"import tensorflow.keras\n",
|
| 23 |
-
"from tensorflow.keras.datasets import mnist\n",
|
| 24 |
-
"from tensorflow.keras.models import load_model\n",
|
| 25 |
-
"\n"
|
| 26 |
-
],
|
| 27 |
-
"outputs": [
|
| 28 |
-
{
|
| 29 |
-
"name": "stderr",
|
| 30 |
-
"output_type": "stream",
|
| 31 |
-
"text": [
|
| 32 |
-
"2025-03-13 21:32:43.419874: I external/local_xla/xla/tsl/cuda/cudart_stub.cc:32] Could not find cuda drivers on your machine, GPU will not be used.\n",
|
| 33 |
-
"2025-03-13 21:32:43.428343: I external/local_xla/xla/tsl/cuda/cudart_stub.cc:32] Could not find cuda drivers on your machine, GPU will not be used.\n",
|
| 34 |
-
"2025-03-13 21:32:43.459714: E external/local_xla/xla/stream_executor/cuda/cuda_fft.cc:477] Unable to register cuFFT factory: Attempting to register factory for plugin cuFFT when one has already been registered\n",
|
| 35 |
-
"WARNING: All log messages before absl::InitializeLog() is called are written to STDERR\n",
|
| 36 |
-
"E0000 00:00:1741881763.520586 24229 cuda_dnn.cc:8310] Unable to register cuDNN factory: Attempting to register factory for plugin cuDNN when one has already been registered\n",
|
| 37 |
-
"E0000 00:00:1741881763.535534 24229 cuda_blas.cc:1418] Unable to register cuBLAS factory: Attempting to register factory for plugin cuBLAS when one has already been registered\n",
|
| 38 |
-
"2025-03-13 21:32:43.581360: I tensorflow/core/platform/cpu_feature_guard.cc:210] This TensorFlow binary is optimized to use available CPU instructions in performance-critical operations.\n",
|
| 39 |
-
"To enable the following instructions: AVX2 FMA, in other operations, rebuild TensorFlow with the appropriate compiler flags.\n"
|
| 40 |
-
]
|
| 41 |
-
}
|
| 42 |
-
],
|
| 43 |
-
"execution_count": 1
|
| 44 |
-
},
|
| 45 |
-
{
|
| 46 |
-
"metadata": {
|
| 47 |
-
"ExecuteTime": {
|
| 48 |
-
"end_time": "2025-03-13T15:53:32.218716Z",
|
| 49 |
-
"start_time": "2025-03-13T15:53:31.940957Z"
|
| 50 |
-
}
|
| 51 |
-
},
|
| 52 |
-
"cell_type": "code",
|
| 53 |
-
"source": "classifier = load_model('mnist_simple_cnn.h5' )",
|
| 54 |
-
"outputs": [
|
| 55 |
-
{
|
| 56 |
-
"name": "stderr",
|
| 57 |
-
"output_type": "stream",
|
| 58 |
-
"text": [
|
| 59 |
-
"2025-03-13 21:23:31.971561: E external/local_xla/xla/stream_executor/cuda/cuda_driver.cc:152] failed call to cuInit: INTERNAL: CUDA error: Failed call to cuInit: UNKNOWN ERROR (303)\n",
|
| 60 |
-
"WARNING:absl:Compiled the loaded model, but the compiled metrics have yet to be built. `model.compile_metrics` will be empty until you train or evaluate the model.\n"
|
| 61 |
-
]
|
| 62 |
-
}
|
| 63 |
-
],
|
| 64 |
-
"execution_count": 2
|
| 65 |
-
},
|
| 66 |
-
{
|
| 67 |
-
"metadata": {
|
| 68 |
-
"jupyter": {
|
| 69 |
-
"is_executing": true
|
| 70 |
-
},
|
| 71 |
-
"ExecuteTime": {
|
| 72 |
-
"start_time": "2025-03-13T15:53:54.412494Z"
|
| 73 |
-
}
|
| 74 |
-
},
|
| 75 |
-
"cell_type": "code",
|
| 76 |
-
"source": [
|
| 77 |
-
"print(classifier.summary())\n",
|
| 78 |
-
"# loads the MNIST dataset\n",
|
| 79 |
-
"(x_train, y_train), (x_test, y_test) = mnist.load_data()\n",
|
| 80 |
-
"\n",
|
| 81 |
-
"def draw_test(name, pred, input_im):\n",
|
| 82 |
-
" BLACK = [0,0,0]\n",
|
| 83 |
-
" expanded_image = cv2.copyMakeBorder(input_im, 0, 0, 0, imageL.shape[0] ,cv2.BORDER_CONSTANT,value=BLACK)\n",
|
| 84 |
-
" expanded_image = cv2.cvtColor(expanded_image, cv2.COLOR_GRAY2BGR)\n",
|
| 85 |
-
" cv2.putText(expanded_image, str(pred), (152, 70) , cv2.FONT_HERSHEY_COMPLEX_SMALL,4, (0,255,0), 2)\n",
|
| 86 |
-
" cv2.imshow(name, expanded_image)\n",
|
| 87 |
-
"\n",
|
| 88 |
-
"for i in range(0,10):\n",
|
| 89 |
-
" rand = np.random.randint(0,len(x_test))\n",
|
| 90 |
-
" input_im = x_test[rand]\n",
|
| 91 |
-
"\n",
|
| 92 |
-
" imageL = cv2.resize(input_im, None, fx=4, fy=4, interpolation = cv2.INTER_CUBIC)\n",
|
| 93 |
-
" input_im = input_im.reshape(1,28,28,1)\n",
|
| 94 |
-
"\n",
|
| 95 |
-
" ## Get Prediction\n",
|
| 96 |
-
" res = str(classifier.predict(input_im, 1, verbose = 0)[0])\n",
|
| 97 |
-
" draw_test(\"Prediction\", res, imageL)\n",
|
| 98 |
-
" cv2.waitKey(0)\n",
|
| 99 |
-
"\n",
|
| 100 |
-
"\n",
|
| 101 |
-
"cv2.destroyAllWindows()\n"
|
| 102 |
-
],
|
| 103 |
-
"outputs": [
|
| 104 |
-
{
|
| 105 |
-
"data": {
|
| 106 |
-
"text/plain": [
|
| 107 |
-
"\u001B[1mModel: \"sequential_3\"\u001B[0m\n"
|
| 108 |
-
],
|
| 109 |
-
"text/html": [
|
| 110 |
-
"<pre style=\"white-space:pre;overflow-x:auto;line-height:normal;font-family:Menlo,'DejaVu Sans Mono',consolas,'Courier New',monospace\"><span style=\"font-weight: bold\">Model: \"sequential_3\"</span>\n",
|
| 111 |
-
"</pre>\n"
|
| 112 |
-
]
|
| 113 |
-
},
|
| 114 |
-
"metadata": {},
|
| 115 |
-
"output_type": "display_data"
|
| 116 |
-
},
|
| 117 |
-
{
|
| 118 |
-
"data": {
|
| 119 |
-
"text/plain": [
|
| 120 |
-
"┏━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━┓\n",
|
| 121 |
-
"┃\u001B[1m \u001B[0m\u001B[1mLayer (type) \u001B[0m\u001B[1m \u001B[0m┃\u001B[1m \u001B[0m\u001B[1mOutput Shape \u001B[0m\u001B[1m \u001B[0m┃\u001B[1m \u001B[0m\u001B[1m Param #\u001B[0m\u001B[1m \u001B[0m┃\n",
|
| 122 |
-
"┡━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━┩\n",
|
| 123 |
-
"│ conv2d_2 (\u001B[38;5;33mConv2D\u001B[0m) │ (\u001B[38;5;45mNone\u001B[0m, \u001B[38;5;34m26\u001B[0m, \u001B[38;5;34m26\u001B[0m, \u001B[38;5;34m32\u001B[0m) │ \u001B[38;5;34m320\u001B[0m │\n",
|
| 124 |
-
"├─────────────────────────────────┼────────────────────────┼───────────────┤\n",
|
| 125 |
-
"│ conv2d_3 (\u001B[38;5;33mConv2D\u001B[0m) │ (\u001B[38;5;45mNone\u001B[0m, \u001B[38;5;34m24\u001B[0m, \u001B[38;5;34m24\u001B[0m, \u001B[38;5;34m64\u001B[0m) │ \u001B[38;5;34m18,496\u001B[0m │\n",
|
| 126 |
-
"├─────────────────────────────────┼────────────────────────┼───────────────┤\n",
|
| 127 |
-
"│ max_pooling2d_1 (\u001B[38;5;33mMaxPooling2D\u001B[0m) │ (\u001B[38;5;45mNone\u001B[0m, \u001B[38;5;34m12\u001B[0m, \u001B[38;5;34m12\u001B[0m, \u001B[38;5;34m64\u001B[0m) │ \u001B[38;5;34m0\u001B[0m │\n",
|
| 128 |
-
"├─────────────────────────────────┼────────────────────────┼───────────────┤\n",
|
| 129 |
-
"│ dropout_2 (\u001B[38;5;33mDropout\u001B[0m) │ (\u001B[38;5;45mNone\u001B[0m, \u001B[38;5;34m12\u001B[0m, \u001B[38;5;34m12\u001B[0m, \u001B[38;5;34m64\u001B[0m) │ \u001B[38;5;34m0\u001B[0m │\n",
|
| 130 |
-
"├─────────────────────────────────┼────────────────────────┼───────────────┤\n",
|
| 131 |
-
"│ flatten_1 (\u001B[38;5;33mFlatten\u001B[0m) │ (\u001B[38;5;45mNone\u001B[0m, \u001B[38;5;34m9216\u001B[0m) │ \u001B[38;5;34m0\u001B[0m │\n",
|
| 132 |
-
"├─────────────────────────────────┼────────────────────────┼───────────────┤\n",
|
| 133 |
-
"│ dense_2 (\u001B[38;5;33mDense\u001B[0m) │ (\u001B[38;5;45mNone\u001B[0m, \u001B[38;5;34m128\u001B[0m) │ \u001B[38;5;34m1,179,776\u001B[0m │\n",
|
| 134 |
-
"├─────────────────────────────────┼────────────────────────┼───────────────┤\n",
|
| 135 |
-
"│ dropout_3 (\u001B[38;5;33mDropout\u001B[0m) │ (\u001B[38;5;45mNone\u001B[0m, \u001B[38;5;34m128\u001B[0m) │ \u001B[38;5;34m0\u001B[0m │\n",
|
| 136 |
-
"├─────────────────────────────────┼────────────────────────┼───────────────┤\n",
|
| 137 |
-
"│ dense_3 (\u001B[38;5;33mDense\u001B[0m) │ (\u001B[38;5;45mNone\u001B[0m, \u001B[38;5;34m10\u001B[0m) │ \u001B[38;5;34m1,290\u001B[0m │\n",
|
| 138 |
-
"└─────────────────────────────────┴────────────────────────┴───────────────┘\n"
|
| 139 |
-
],
|
| 140 |
-
"text/html": [
|
| 141 |
-
"<pre style=\"white-space:pre;overflow-x:auto;line-height:normal;font-family:Menlo,'DejaVu Sans Mono',consolas,'Courier New',monospace\">┏━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━┓\n",
|
| 142 |
-
"┃<span style=\"font-weight: bold\"> Layer (type) </span>┃<span style=\"font-weight: bold\"> Output Shape </span>┃<span style=\"font-weight: bold\"> Param # </span>┃\n",
|
| 143 |
-
"┡━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━┩\n",
|
| 144 |
-
"│ conv2d_2 (<span style=\"color: #0087ff; text-decoration-color: #0087ff\">Conv2D</span>) │ (<span style=\"color: #00d7ff; text-decoration-color: #00d7ff\">None</span>, <span style=\"color: #00af00; text-decoration-color: #00af00\">26</span>, <span style=\"color: #00af00; text-decoration-color: #00af00\">26</span>, <span style=\"color: #00af00; text-decoration-color: #00af00\">32</span>) │ <span style=\"color: #00af00; text-decoration-color: #00af00\">320</span> │\n",
|
| 145 |
-
"├─────────────────────────────────┼────────────────────────┼───────────────┤\n",
|
| 146 |
-
"│ conv2d_3 (<span style=\"color: #0087ff; text-decoration-color: #0087ff\">Conv2D</span>) │ (<span style=\"color: #00d7ff; text-decoration-color: #00d7ff\">None</span>, <span style=\"color: #00af00; text-decoration-color: #00af00\">24</span>, <span style=\"color: #00af00; text-decoration-color: #00af00\">24</span>, <span style=\"color: #00af00; text-decoration-color: #00af00\">64</span>) │ <span style=\"color: #00af00; text-decoration-color: #00af00\">18,496</span> │\n",
|
| 147 |
-
"├─────────────────────────────────┼────────────────────────┼───────────────┤\n",
|
| 148 |
-
"│ max_pooling2d_1 (<span style=\"color: #0087ff; text-decoration-color: #0087ff\">MaxPooling2D</span>) │ (<span style=\"color: #00d7ff; text-decoration-color: #00d7ff\">None</span>, <span style=\"color: #00af00; text-decoration-color: #00af00\">12</span>, <span style=\"color: #00af00; text-decoration-color: #00af00\">12</span>, <span style=\"color: #00af00; text-decoration-color: #00af00\">64</span>) │ <span style=\"color: #00af00; text-decoration-color: #00af00\">0</span> │\n",
|
| 149 |
-
"├─────────────────────────────────┼────────────────────────┼───────────────┤\n",
|
| 150 |
-
"│ dropout_2 (<span style=\"color: #0087ff; text-decoration-color: #0087ff\">Dropout</span>) │ (<span style=\"color: #00d7ff; text-decoration-color: #00d7ff\">None</span>, <span style=\"color: #00af00; text-decoration-color: #00af00\">12</span>, <span style=\"color: #00af00; text-decoration-color: #00af00\">12</span>, <span style=\"color: #00af00; text-decoration-color: #00af00\">64</span>) │ <span style=\"color: #00af00; text-decoration-color: #00af00\">0</span> │\n",
|
| 151 |
-
"├─────────────────────────────────┼────────────────────────┼───────────────┤\n",
|
| 152 |
-
"│ flatten_1 (<span style=\"color: #0087ff; text-decoration-color: #0087ff\">Flatten</span>) │ (<span style=\"color: #00d7ff; text-decoration-color: #00d7ff\">None</span>, <span style=\"color: #00af00; text-decoration-color: #00af00\">9216</span>) │ <span style=\"color: #00af00; text-decoration-color: #00af00\">0</span> │\n",
|
| 153 |
-
"├─────────────────────────────────┼────────────────────────┼───────────────┤\n",
|
| 154 |
-
"│ dense_2 (<span style=\"color: #0087ff; text-decoration-color: #0087ff\">Dense</span>) │ (<span style=\"color: #00d7ff; text-decoration-color: #00d7ff\">None</span>, <span style=\"color: #00af00; text-decoration-color: #00af00\">128</span>) │ <span style=\"color: #00af00; text-decoration-color: #00af00\">1,179,776</span> │\n",
|
| 155 |
-
"├─────────────────────────────────┼────────────────────────┼───────────────┤\n",
|
| 156 |
-
"│ dropout_3 (<span style=\"color: #0087ff; text-decoration-color: #0087ff\">Dropout</span>) │ (<span style=\"color: #00d7ff; text-decoration-color: #00d7ff\">None</span>, <span style=\"color: #00af00; text-decoration-color: #00af00\">128</span>) │ <span style=\"color: #00af00; text-decoration-color: #00af00\">0</span> │\n",
|
| 157 |
-
"├─────────────────────────────────┼────────────────────────┼───────────────┤\n",
|
| 158 |
-
"│ dense_3 (<span style=\"color: #0087ff; text-decoration-color: #0087ff\">Dense</span>) │ (<span style=\"color: #00d7ff; text-decoration-color: #00d7ff\">None</span>, <span style=\"color: #00af00; text-decoration-color: #00af00\">10</span>) │ <span style=\"color: #00af00; text-decoration-color: #00af00\">1,290</span> │\n",
|
| 159 |
-
"└──���──────────────────────────────┴────────────────────────┴───────────────┘\n",
|
| 160 |
-
"</pre>\n"
|
| 161 |
-
]
|
| 162 |
-
},
|
| 163 |
-
"metadata": {},
|
| 164 |
-
"output_type": "display_data"
|
| 165 |
-
},
|
| 166 |
-
{
|
| 167 |
-
"data": {
|
| 168 |
-
"text/plain": [
|
| 169 |
-
"\u001B[1m Total params: \u001B[0m\u001B[38;5;34m1,199,884\u001B[0m (4.58 MB)\n"
|
| 170 |
-
],
|
| 171 |
-
"text/html": [
|
| 172 |
-
"<pre style=\"white-space:pre;overflow-x:auto;line-height:normal;font-family:Menlo,'DejaVu Sans Mono',consolas,'Courier New',monospace\"><span style=\"font-weight: bold\"> Total params: </span><span style=\"color: #00af00; text-decoration-color: #00af00\">1,199,884</span> (4.58 MB)\n",
|
| 173 |
-
"</pre>\n"
|
| 174 |
-
]
|
| 175 |
-
},
|
| 176 |
-
"metadata": {},
|
| 177 |
-
"output_type": "display_data"
|
| 178 |
-
},
|
| 179 |
-
{
|
| 180 |
-
"data": {
|
| 181 |
-
"text/plain": [
|
| 182 |
-
"\u001B[1m Trainable params: \u001B[0m\u001B[38;5;34m1,199,882\u001B[0m (4.58 MB)\n"
|
| 183 |
-
],
|
| 184 |
-
"text/html": [
|
| 185 |
-
"<pre style=\"white-space:pre;overflow-x:auto;line-height:normal;font-family:Menlo,'DejaVu Sans Mono',consolas,'Courier New',monospace\"><span style=\"font-weight: bold\"> Trainable params: </span><span style=\"color: #00af00; text-decoration-color: #00af00\">1,199,882</span> (4.58 MB)\n",
|
| 186 |
-
"</pre>\n"
|
| 187 |
-
]
|
| 188 |
-
},
|
| 189 |
-
"metadata": {},
|
| 190 |
-
"output_type": "display_data"
|
| 191 |
-
},
|
| 192 |
-
{
|
| 193 |
-
"data": {
|
| 194 |
-
"text/plain": [
|
| 195 |
-
"\u001B[1m Non-trainable params: \u001B[0m\u001B[38;5;34m0\u001B[0m (0.00 B)\n"
|
| 196 |
-
],
|
| 197 |
-
"text/html": [
|
| 198 |
-
"<pre style=\"white-space:pre;overflow-x:auto;line-height:normal;font-family:Menlo,'DejaVu Sans Mono',consolas,'Courier New',monospace\"><span style=\"font-weight: bold\"> Non-trainable params: </span><span style=\"color: #00af00; text-decoration-color: #00af00\">0</span> (0.00 B)\n",
|
| 199 |
-
"</pre>\n"
|
| 200 |
-
]
|
| 201 |
-
},
|
| 202 |
-
"metadata": {},
|
| 203 |
-
"output_type": "display_data"
|
| 204 |
-
},
|
| 205 |
-
{
|
| 206 |
-
"data": {
|
| 207 |
-
"text/plain": [
|
| 208 |
-
"\u001B[1m Optimizer params: \u001B[0m\u001B[38;5;34m2\u001B[0m (12.00 B)\n"
|
| 209 |
-
],
|
| 210 |
-
"text/html": [
|
| 211 |
-
"<pre style=\"white-space:pre;overflow-x:auto;line-height:normal;font-family:Menlo,'DejaVu Sans Mono',consolas,'Courier New',monospace\"><span style=\"font-weight: bold\"> Optimizer params: </span><span style=\"color: #00af00; text-decoration-color: #00af00\">2</span> (12.00 B)\n",
|
| 212 |
-
"</pre>\n"
|
| 213 |
-
]
|
| 214 |
-
},
|
| 215 |
-
"metadata": {},
|
| 216 |
-
"output_type": "display_data"
|
| 217 |
-
},
|
| 218 |
-
{
|
| 219 |
-
"name": "stdout",
|
| 220 |
-
"output_type": "stream",
|
| 221 |
-
"text": [
|
| 222 |
-
"None\n"
|
| 223 |
-
]
|
| 224 |
-
}
|
| 225 |
-
],
|
| 226 |
-
"execution_count": null
|
| 227 |
-
},
|
| 228 |
-
{
|
| 229 |
-
"metadata": {
|
| 230 |
-
"ExecuteTime": {
|
| 231 |
-
"end_time": "2025-03-13T16:05:45.557030Z",
|
| 232 |
-
"start_time": "2025-03-13T16:05:45.541940Z"
|
| 233 |
-
}
|
| 234 |
-
},
|
| 235 |
-
"cell_type": "code",
|
| 236 |
-
"source": "import numpy",
|
| 237 |
-
"outputs": [],
|
| 238 |
-
"execution_count": 2
|
| 239 |
-
},
|
| 240 |
-
{
|
| 241 |
-
"cell_type": "markdown",
|
| 242 |
-
"metadata": {},
|
| 243 |
-
"source": [
|
| 244 |
-
"### Testing our classifier on a real image"
|
| 245 |
-
]
|
| 246 |
-
},
|
| 247 |
-
{
|
| 248 |
-
"cell_type": "code",
|
| 249 |
-
"metadata": {
|
| 250 |
-
"jupyter": {
|
| 251 |
-
"is_executing": true
|
| 252 |
-
},
|
| 253 |
-
"ExecuteTime": {
|
| 254 |
-
"start_time": "2025-03-13T16:05:47.832541Z"
|
| 255 |
-
}
|
| 256 |
-
},
|
| 257 |
-
"source": [
|
| 258 |
-
"import numpy as np\n",
|
| 259 |
-
"import cv2\n",
|
| 260 |
-
"from preprocessors import x_cord_contour, makeSquare, resize_to_pixel\n",
|
| 261 |
-
" \n",
|
| 262 |
-
"image = cv2.imread('images/numbers.jpg')\n",
|
| 263 |
-
"gray = cv2.cvtColor(image,cv2.COLOR_BGR2GRAY)\n",
|
| 264 |
-
"cv2.imshow(\"image\", image)\n",
|
| 265 |
-
"cv2.waitKey(0)\n",
|
| 266 |
-
"\n",
|
| 267 |
-
"# Blur image then find edges using Canny \n",
|
| 268 |
-
"blurred = cv2.GaussianBlur(gray, (5, 5), 0)\n",
|
| 269 |
-
"#cv2.imshow(\"blurred\", blurred)\n",
|
| 270 |
-
"#cv2.waitKey(0)\n",
|
| 271 |
-
"\n",
|
| 272 |
-
"edged = cv2.Canny(blurred, 30, 150)\n",
|
| 273 |
-
"#cv2.imshow(\"edged\", edged)\n",
|
| 274 |
-
"#cv2.waitKey(0)\n",
|
| 275 |
-
"\n",
|
| 276 |
-
"# Find Contours\n",
|
| 277 |
-
"contours, _ = cv2.findContours(edged.copy(), cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)\n",
|
| 278 |
-
"\n",
|
| 279 |
-
"#Sort out contours left to right by using their x cordinates\n",
|
| 280 |
-
"contours = sorted(contours, key = x_cord_contour, reverse = False)\n",
|
| 281 |
-
"\n",
|
| 282 |
-
"# Create empty array to store entire number\n",
|
| 283 |
-
"full_number = []\n",
|
| 284 |
-
"\n",
|
| 285 |
-
"# loop over the contours\n",
|
| 286 |
-
"for c in contours:\n",
|
| 287 |
-
" # compute the bounding box for the rectangle\n",
|
| 288 |
-
" (x, y, w, h) = cv2.boundingRect(c) \n",
|
| 289 |
-
"\n",
|
| 290 |
-
" if w >= 5 and h >= 25:\n",
|
| 291 |
-
" roi = blurred[y:y + h, x:x + w]\n",
|
| 292 |
-
" ret, roi = cv2.threshold(roi, 127, 255,cv2.THRESH_BINARY_INV)\n",
|
| 293 |
-
" roi = makeSquare(roi)\n",
|
| 294 |
-
" roi = resize_to_pixel(28, roi)\n",
|
| 295 |
-
" cv2.imshow(\"ROI\", roi)\n",
|
| 296 |
-
" roi = roi / 255.0 \n",
|
| 297 |
-
" roi = roi.reshape(1,28,28,1) \n",
|
| 298 |
-
"\n",
|
| 299 |
-
" ## Get Prediction\n",
|
| 300 |
-
" res = str(classifier.predict_classes(roi, 1, verbose = 0)[0])\n",
|
| 301 |
-
" full_number.append(res)\n",
|
| 302 |
-
" cv2.rectangle(image, (x, y), (x + w, y + h), (0, 0, 255), 2)\n",
|
| 303 |
-
" cv2.putText(image, res, (x , y + 155), cv2.FONT_HERSHEY_COMPLEX, 2, (255, 0, 0), 2)\n",
|
| 304 |
-
" cv2.imshow(\"image\", image)\n",
|
| 305 |
-
" cv2.waitKey(0) \n",
|
| 306 |
-
" \n",
|
| 307 |
-
"cv2.destroyAllWindows()\n",
|
| 308 |
-
"print (\"The number is: \" + ''.join(full_number))"
|
| 309 |
-
],
|
| 310 |
-
"outputs": [],
|
| 311 |
-
"execution_count": null
|
| 312 |
-
},
|
| 313 |
-
{
|
| 314 |
-
"cell_type": "markdown",
|
| 315 |
-
"metadata": {},
|
| 316 |
-
"source": [
|
| 317 |
-
"### Training this Model"
|
| 318 |
-
]
|
| 319 |
-
},
|
| 320 |
-
{
|
| 321 |
-
"cell_type": "code",
|
| 322 |
-
"metadata": {
|
| 323 |
-
"ExecuteTime": {
|
| 324 |
-
"end_time": "2025-03-13T13:12:31.511869Z",
|
| 325 |
-
"start_time": "2025-03-13T12:29:08.572403Z"
|
| 326 |
-
}
|
| 327 |
-
},
|
| 328 |
-
"source": [
|
| 329 |
-
"from tensorflow.keras.datasets import mnist\n",
|
| 330 |
-
"from tensorflow.keras.utils import to_categorical\n",
|
| 331 |
-
"from tensorflow.keras.optimizers import Adadelta\n",
|
| 332 |
-
"from tensorflow.keras.datasets import mnist\n",
|
| 333 |
-
"from tensorflow.keras.models import Sequential\n",
|
| 334 |
-
"from tensorflow.keras.layers import Dense, Dropout, Flatten\n",
|
| 335 |
-
"from tensorflow.keras.layers import Conv2D, MaxPooling2D ,Input\n",
|
| 336 |
-
"from tensorflow.keras import backend as K\n",
|
| 337 |
-
"\n",
|
| 338 |
-
"# Training Parameters\n",
|
| 339 |
-
"batch_size = 128\n",
|
| 340 |
-
"epochs = 20\n",
|
| 341 |
-
"\n",
|
| 342 |
-
"# loads the MNIST dataset\n",
|
| 343 |
-
"(x_train, y_train), (x_test, y_test) = mnist.load_data()\n",
|
| 344 |
-
"\n",
|
| 345 |
-
"# Lets store the number of rows and columns\n",
|
| 346 |
-
"img_rows = x_train[0].shape[0]\n",
|
| 347 |
-
"img_cols = x_train[1].shape[0]\n",
|
| 348 |
-
"\n",
|
| 349 |
-
"# Getting our date in the right 'shape' needed for Keras\n",
|
| 350 |
-
"# We need to add a 4th dimenion to our date thereby changing our\n",
|
| 351 |
-
"# Our original image shape of (60000,28,28) to (60000,28,28,1)\n",
|
| 352 |
-
"x_train = x_train.reshape(x_train.shape[0], img_rows, img_cols, 1)\n",
|
| 353 |
-
"x_test = x_test.reshape(x_test.shape[0], img_rows, img_cols, 1)\n",
|
| 354 |
-
"\n",
|
| 355 |
-
"# store the shape of a single image \n",
|
| 356 |
-
"input_shape = (img_rows, img_cols, 1)\n",
|
| 357 |
-
"\n",
|
| 358 |
-
"# change our image type to float32 data type\n",
|
| 359 |
-
"x_train = x_train.astype('float32')\n",
|
| 360 |
-
"x_test = x_test.astype('float32')\n",
|
| 361 |
-
"\n",
|
| 362 |
-
"# Normalize our data by changing the range from (0 to 255) to (0 to 1)\n",
|
| 363 |
-
"x_train /= 255\n",
|
| 364 |
-
"x_test /= 255\n",
|
| 365 |
-
"\n",
|
| 366 |
-
"print('x_train shape:', x_train.shape)\n",
|
| 367 |
-
"print(x_train.shape[0], 'train samples')\n",
|
| 368 |
-
"print(x_test.shape[0], 'test samples')\n",
|
| 369 |
-
"\n",
|
| 370 |
-
"# Now we one hot encode outputs\n",
|
| 371 |
-
"y_train = to_categorical(y_train)\n",
|
| 372 |
-
"y_test = to_categorical(y_test)\n",
|
| 373 |
-
"\n",
|
| 374 |
-
"# Let's count the number columns in our hot encoded matrix \n",
|
| 375 |
-
"print (\"Number of Classes: \" + str(y_test.shape[1]))\n",
|
| 376 |
-
"\n",
|
| 377 |
-
"num_classes = y_test.shape[1]\n",
|
| 378 |
-
"num_pixels = x_train.shape[1] * x_train.shape[2]\n",
|
| 379 |
-
"\n",
|
| 380 |
-
"# create model\n",
|
| 381 |
-
"model = Sequential()\n",
|
| 382 |
-
"\n",
|
| 383 |
-
"model.add(Input(shape=(28,28,1)))\n",
|
| 384 |
-
"model.add(Conv2D(32, kernel_size=(3, 3),\n",
|
| 385 |
-
" activation='relu',\n",
|
| 386 |
-
" input_shape=input_shape))\n",
|
| 387 |
-
"model.add(Conv2D(64, (3, 3), activation='relu'))\n",
|
| 388 |
-
"model.add(MaxPooling2D(pool_size=(2, 2)))\n",
|
| 389 |
-
"model.add(Dropout(0.25))\n",
|
| 390 |
-
"model.add(Flatten())\n",
|
| 391 |
-
"model.add(Dense(128, activation='relu'))\n",
|
| 392 |
-
"model.add(Dropout(0.5))\n",
|
| 393 |
-
"model.add(Dense(num_classes, activation='softmax'))\n",
|
| 394 |
-
"\n",
|
| 395 |
-
"model.compile(loss = 'categorical_crossentropy',\n",
|
| 396 |
-
" optimizer = Adadelta(),\n",
|
| 397 |
-
" metrics = ['accuracy'])\n",
|
| 398 |
-
"\n",
|
| 399 |
-
"print(model.summary())\n",
|
| 400 |
-
"\n",
|
| 401 |
-
"history = model.fit(x_train, y_train,\n",
|
| 402 |
-
" batch_size=batch_size,\n",
|
| 403 |
-
" epochs=epochs,\n",
|
| 404 |
-
" verbose=1,\n",
|
| 405 |
-
" validation_data=(x_test, y_test))\n",
|
| 406 |
-
"\n",
|
| 407 |
-
"score = model.evaluate(x_test, y_test, verbose=0)\n",
|
| 408 |
-
"print('Test loss:', score[0])\n",
|
| 409 |
-
"print('Test accuracy:', score[1])"
|
| 410 |
-
],
|
| 411 |
-
"outputs": [
|
| 412 |
-
{
|
| 413 |
-
"name": "stdout",
|
| 414 |
-
"output_type": "stream",
|
| 415 |
-
"text": [
|
| 416 |
-
"x_train shape: (60000, 28, 28, 1)\n",
|
| 417 |
-
"60000 train samples\n",
|
| 418 |
-
"10000 test samples\n",
|
| 419 |
-
"Number of Classes: 10\n"
|
| 420 |
-
]
|
| 421 |
-
},
|
| 422 |
-
{
|
| 423 |
-
"data": {
|
| 424 |
-
"text/plain": [
|
| 425 |
-
"\u001B[1mModel: \"sequential_3\"\u001B[0m\n"
|
| 426 |
-
],
|
| 427 |
-
"text/html": [
|
| 428 |
-
"<pre style=\"white-space:pre;overflow-x:auto;line-height:normal;font-family:Menlo,'DejaVu Sans Mono',consolas,'Courier New',monospace\"><span style=\"font-weight: bold\">Model: \"sequential_3\"</span>\n",
|
| 429 |
-
"</pre>\n"
|
| 430 |
-
]
|
| 431 |
-
},
|
| 432 |
-
"metadata": {},
|
| 433 |
-
"output_type": "display_data"
|
| 434 |
-
},
|
| 435 |
-
{
|
| 436 |
-
"data": {
|
| 437 |
-
"text/plain": [
|
| 438 |
-
"┏━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━┓\n",
|
| 439 |
-
"┃\u001B[1m \u001B[0m\u001B[1mLayer (type) \u001B[0m\u001B[1m \u001B[0m┃\u001B[1m \u001B[0m\u001B[1mOutput Shape \u001B[0m\u001B[1m \u001B[0m┃\u001B[1m \u001B[0m\u001B[1m Param #\u001B[0m\u001B[1m \u001B[0m┃\n",
|
| 440 |
-
"┡━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━┩\n",
|
| 441 |
-
"│ conv2d_2 (\u001B[38;5;33mConv2D\u001B[0m) │ (\u001B[38;5;45mNone\u001B[0m, \u001B[38;5;34m26\u001B[0m, \u001B[38;5;34m26\u001B[0m, \u001B[38;5;34m32\u001B[0m) │ \u001B[38;5;34m320\u001B[0m │\n",
|
| 442 |
-
"├─────────────────────────────────┼────────────────────────┼───────────────┤\n",
|
| 443 |
-
"│ conv2d_3 (\u001B[38;5;33mConv2D\u001B[0m) │ (\u001B[38;5;45mNone\u001B[0m, \u001B[38;5;34m24\u001B[0m, \u001B[38;5;34m24\u001B[0m, \u001B[38;5;34m64\u001B[0m) │ \u001B[38;5;34m18,496\u001B[0m │\n",
|
| 444 |
-
"├─────────────────────────────────┼────────────────────────┼───────────────┤\n",
|
| 445 |
-
"│ max_pooling2d_1 (\u001B[38;5;33mMaxPooling2D\u001B[0m) │ (\u001B[38;5;45mNone\u001B[0m, \u001B[38;5;34m12\u001B[0m, \u001B[38;5;34m12\u001B[0m, \u001B[38;5;34m64\u001B[0m) │ \u001B[38;5;34m0\u001B[0m │\n",
|
| 446 |
-
"├─────────────────────────────────┼────────────────────────┼───────────────┤\n",
|
| 447 |
-
"│ dropout_2 (\u001B[38;5;33mDropout\u001B[0m) │ (\u001B[38;5;45mNone\u001B[0m, \u001B[38;5;34m12\u001B[0m, \u001B[38;5;34m12\u001B[0m, \u001B[38;5;34m64\u001B[0m) │ \u001B[38;5;34m0\u001B[0m │\n",
|
| 448 |
-
"├─────────────────────────────────┼────────────────────────┼───────────────┤\n",
|
| 449 |
-
"│ flatten_1 (\u001B[38;5;33mFlatten\u001B[0m) │ (\u001B[38;5;45mNone\u001B[0m, \u001B[38;5;34m9216\u001B[0m) │ \u001B[38;5;34m0\u001B[0m │\n",
|
| 450 |
-
"├─────────────────────────────────┼────────────────────────┼───────────────┤\n",
|
| 451 |
-
"│ dense_2 (\u001B[38;5;33mDense\u001B[0m) │ (\u001B[38;5;45mNone\u001B[0m, \u001B[38;5;34m128\u001B[0m) │ \u001B[38;5;34m1,179,776\u001B[0m │\n",
|
| 452 |
-
"├─────────────────────────────────┼────────────────────────┼───────────────┤\n",
|
| 453 |
-
"│ dropout_3 (\u001B[38;5;33mDropout\u001B[0m) │ (\u001B[38;5;45mNone\u001B[0m, \u001B[38;5;34m128\u001B[0m) │ \u001B[38;5;34m0\u001B[0m │\n",
|
| 454 |
-
"├─────────────────────────────────┼────────────────────────┼───────────────┤\n",
|
| 455 |
-
"│ dense_3 (\u001B[38;5;33mDense\u001B[0m) │ (\u001B[38;5;45mNone\u001B[0m, \u001B[38;5;34m10\u001B[0m) │ \u001B[38;5;34m1,290\u001B[0m │\n",
|
| 456 |
-
"└─────────────────────────────────┴────────────────────────┴───────────────┘\n"
|
| 457 |
-
],
|
| 458 |
-
"text/html": [
|
| 459 |
-
"<pre style=\"white-space:pre;overflow-x:auto;line-height:normal;font-family:Menlo,'DejaVu Sans Mono',consolas,'Courier New',monospace\">┏━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━┓\n",
|
| 460 |
-
"┃<span style=\"font-weight: bold\"> Layer (type) </span>┃<span style=\"font-weight: bold\"> Output Shape </span>┃<span style=\"font-weight: bold\"> Param # </span>┃\n",
|
| 461 |
-
"┡━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━┩\n",
|
| 462 |
-
"│ conv2d_2 (<span style=\"color: #0087ff; text-decoration-color: #0087ff\">Conv2D</span>) │ (<span style=\"color: #00d7ff; text-decoration-color: #00d7ff\">None</span>, <span style=\"color: #00af00; text-decoration-color: #00af00\">26</span>, <span style=\"color: #00af00; text-decoration-color: #00af00\">26</span>, <span style=\"color: #00af00; text-decoration-color: #00af00\">32</span>) │ <span style=\"color: #00af00; text-decoration-color: #00af00\">320</span> │\n",
|
| 463 |
-
"├─────────────────────────────────┼────────────────────────┼───────────────┤\n",
|
| 464 |
-
"│ conv2d_3 (<span style=\"color: #0087ff; text-decoration-color: #0087ff\">Conv2D</span>) │ (<span style=\"color: #00d7ff; text-decoration-color: #00d7ff\">None</span>, <span style=\"color: #00af00; text-decoration-color: #00af00\">24</span>, <span style=\"color: #00af00; text-decoration-color: #00af00\">24</span>, <span style=\"color: #00af00; text-decoration-color: #00af00\">64</span>) │ <span style=\"color: #00af00; text-decoration-color: #00af00\">18,496</span> │\n",
|
| 465 |
-
"├─────────────────────────────────┼────────────────────────┼───────────────┤\n",
|
| 466 |
-
"│ max_pooling2d_1 (<span style=\"color: #0087ff; text-decoration-color: #0087ff\">MaxPooling2D</span>) │ (<span style=\"color: #00d7ff; text-decoration-color: #00d7ff\">None</span>, <span style=\"color: #00af00; text-decoration-color: #00af00\">12</span>, <span style=\"color: #00af00; text-decoration-color: #00af00\">12</span>, <span style=\"color: #00af00; text-decoration-color: #00af00\">64</span>) │ <span style=\"color: #00af00; text-decoration-color: #00af00\">0</span> │\n",
|
| 467 |
-
"├─────────────────────────────────┼────────────────────────┼───────────────┤\n",
|
| 468 |
-
"│ dropout_2 (<span style=\"color: #0087ff; text-decoration-color: #0087ff\">Dropout</span>) │ (<span style=\"color: #00d7ff; text-decoration-color: #00d7ff\">None</span>, <span style=\"color: #00af00; text-decoration-color: #00af00\">12</span>, <span style=\"color: #00af00; text-decoration-color: #00af00\">12</span>, <span style=\"color: #00af00; text-decoration-color: #00af00\">64</span>) │ <span style=\"color: #00af00; text-decoration-color: #00af00\">0</span> │\n",
|
| 469 |
-
"├─────────────────────────────────┼────────────────────────┼───────────────┤\n",
|
| 470 |
-
"│ flatten_1 (<span style=\"color: #0087ff; text-decoration-color: #0087ff\">Flatten</span>) │ (<span style=\"color: #00d7ff; text-decoration-color: #00d7ff\">None</span>, <span style=\"color: #00af00; text-decoration-color: #00af00\">9216</span>) │ <span style=\"color: #00af00; text-decoration-color: #00af00\">0</span> │\n",
|
| 471 |
-
"├─────────────────────────────────┼────────────────────────┼───────────────┤\n",
|
| 472 |
-
"│ dense_2 (<span style=\"color: #0087ff; text-decoration-color: #0087ff\">Dense</span>) │ (<span style=\"color: #00d7ff; text-decoration-color: #00d7ff\">None</span>, <span style=\"color: #00af00; text-decoration-color: #00af00\">128</span>) │ <span style=\"color: #00af00; text-decoration-color: #00af00\">1,179,776</span> │\n",
|
| 473 |
-
"├─────────────────────────────────┼────────────────────────┼───────────────┤\n",
|
| 474 |
-
"│ dropout_3 (<span style=\"color: #0087ff; text-decoration-color: #0087ff\">Dropout</span>) │ (<span style=\"color: #00d7ff; text-decoration-color: #00d7ff\">None</span>, <span style=\"color: #00af00; text-decoration-color: #00af00\">128</span>) │ <span style=\"color: #00af00; text-decoration-color: #00af00\">0</span> │\n",
|
| 475 |
-
"├─────────────────────────────────┼────────────────────────┼───────────────┤\n",
|
| 476 |
-
"│ dense_3 (<span style=\"color: #0087ff; text-decoration-color: #0087ff\">Dense</span>) │ (<span style=\"color: #00d7ff; text-decoration-color: #00d7ff\">None</span>, <span style=\"color: #00af00; text-decoration-color: #00af00\">10</span>) │ <span style=\"color: #00af00; text-decoration-color: #00af00\">1,290</span> │\n",
|
| 477 |
-
"└─────────────────────────────────┴────────────────────────┴───────────────┘\n",
|
| 478 |
-
"</pre>\n"
|
| 479 |
-
]
|
| 480 |
-
},
|
| 481 |
-
"metadata": {},
|
| 482 |
-
"output_type": "display_data"
|
| 483 |
-
},
|
| 484 |
-
{
|
| 485 |
-
"data": {
|
| 486 |
-
"text/plain": [
|
| 487 |
-
"\u001B[1m Total params: \u001B[0m\u001B[38;5;34m1,199,882\u001B[0m (4.58 MB)\n"
|
| 488 |
-
],
|
| 489 |
-
"text/html": [
|
| 490 |
-
"<pre style=\"white-space:pre;overflow-x:auto;line-height:normal;font-family:Menlo,'DejaVu Sans Mono',consolas,'Courier New',monospace\"><span style=\"font-weight: bold\"> Total params: </span><span style=\"color: #00af00; text-decoration-color: #00af00\">1,199,882</span> (4.58 MB)\n",
|
| 491 |
-
"</pre>\n"
|
| 492 |
-
]
|
| 493 |
-
},
|
| 494 |
-
"metadata": {},
|
| 495 |
-
"output_type": "display_data"
|
| 496 |
-
},
|
| 497 |
-
{
|
| 498 |
-
"data": {
|
| 499 |
-
"text/plain": [
|
| 500 |
-
"\u001B[1m Trainable params: \u001B[0m\u001B[38;5;34m1,199,882\u001B[0m (4.58 MB)\n"
|
| 501 |
-
],
|
| 502 |
-
"text/html": [
|
| 503 |
-
"<pre style=\"white-space:pre;overflow-x:auto;line-height:normal;font-family:Menlo,'DejaVu Sans Mono',consolas,'Courier New',monospace\"><span style=\"font-weight: bold\"> Trainable params: </span><span style=\"color: #00af00; text-decoration-color: #00af00\">1,199,882</span> (4.58 MB)\n",
|
| 504 |
-
"</pre>\n"
|
| 505 |
-
]
|
| 506 |
-
},
|
| 507 |
-
"metadata": {},
|
| 508 |
-
"output_type": "display_data"
|
| 509 |
-
},
|
| 510 |
-
{
|
| 511 |
-
"data": {
|
| 512 |
-
"text/plain": [
|
| 513 |
-
"\u001B[1m Non-trainable params: \u001B[0m\u001B[38;5;34m0\u001B[0m (0.00 B)\n"
|
| 514 |
-
],
|
| 515 |
-
"text/html": [
|
| 516 |
-
"<pre style=\"white-space:pre;overflow-x:auto;line-height:normal;font-family:Menlo,'DejaVu Sans Mono',consolas,'Courier New',monospace\"><span style=\"font-weight: bold\"> Non-trainable params: </span><span style=\"color: #00af00; text-decoration-color: #00af00\">0</span> (0.00 B)\n",
|
| 517 |
-
"</pre>\n"
|
| 518 |
-
]
|
| 519 |
-
},
|
| 520 |
-
"metadata": {},
|
| 521 |
-
"output_type": "display_data"
|
| 522 |
-
},
|
| 523 |
-
{
|
| 524 |
-
"name": "stdout",
|
| 525 |
-
"output_type": "stream",
|
| 526 |
-
"text": [
|
| 527 |
-
"None\n",
|
| 528 |
-
"Epoch 1/20\n"
|
| 529 |
-
]
|
| 530 |
-
},
|
| 531 |
-
{
|
| 532 |
-
"name": "stderr",
|
| 533 |
-
"output_type": "stream",
|
| 534 |
-
"text": [
|
| 535 |
-
"2025-03-13 17:59:09.869544: W external/local_xla/xla/tsl/framework/cpu_allocator_impl.cc:83] Allocation of 188160000 exceeds 10% of free system memory.\n"
|
| 536 |
-
]
|
| 537 |
-
},
|
| 538 |
-
{
|
| 539 |
-
"name": "stdout",
|
| 540 |
-
"output_type": "stream",
|
| 541 |
-
"text": [
|
| 542 |
-
"\u001B[1m469/469\u001B[0m \u001B[32m━━━━━━━━━━━━━━━━━━━━\u001B[0m\u001B[37m\u001B[0m \u001B[1m110s\u001B[0m 229ms/step - accuracy: 0.1052 - loss: 2.3000 - val_accuracy: 0.2693 - val_loss: 2.2488\n",
|
| 543 |
-
"Epoch 2/20\n",
|
| 544 |
-
"\u001B[1m469/469\u001B[0m \u001B[32m━━━━━━━━━━━━━━━━━━━━\u001B[0m\u001B[37m\u001B[0m \u001B[1m162s\u001B[0m 271ms/step - accuracy: 0.2265 - loss: 2.2399 - val_accuracy: 0.4505 - val_loss: 2.1755\n",
|
| 545 |
-
"Epoch 3/20\n",
|
| 546 |
-
"\u001B[1m469/469\u001B[0m \u001B[32m━━━━━━━━━━━━━━━━━━━━\u001B[0m\u001B[37m\u001B[0m \u001B[1m101s\u001B[0m 215ms/step - accuracy: 0.3401 - loss: 2.1674 - val_accuracy: 0.5460 - val_loss: 2.0756\n",
|
| 547 |
-
"Epoch 4/20\n",
|
| 548 |
-
"\u001B[1m469/469\u001B[0m \u001B[32m━━━━━━━━━━━━━━━━━━━━\u001B[0m\u001B[37m\u001B[0m \u001B[1m98s\u001B[0m 208ms/step - accuracy: 0.4168 - loss: 2.0707 - val_accuracy: 0.6309 - val_loss: 1.9400\n",
|
| 549 |
-
"Epoch 5/20\n",
|
| 550 |
-
"\u001B[1m469/469\u001B[0m \u001B[32m━━━━━━━━━━━━━━━━━━━━\u001B[0m\u001B[37m\u001B[0m \u001B[1m101s\u001B[0m 216ms/step - accuracy: 0.4859 - loss: 1.9388 - val_accuracy: 0.7004 - val_loss: 1.7650\n",
|
| 551 |
-
"Epoch 6/20\n",
|
| 552 |
-
"\u001B[1m469/469\u001B[0m \u001B[32m━━━━━━━━━━━━━━━━━━━━\u001B[0m\u001B[37m\u001B[0m \u001B[1m148s\u001B[0m 228ms/step - accuracy: 0.5514 - loss: 1.7740 - val_accuracy: 0.7517 - val_loss: 1.5543\n",
|
| 553 |
-
"Epoch 7/20\n",
|
| 554 |
-
"\u001B[1m469/469\u001B[0m \u001B[32m━━━━━━━━━━━━━━━━━━━━\u001B[0m\u001B[37m\u001B[0m \u001B[1m106s\u001B[0m 225ms/step - accuracy: 0.5989 - loss: 1.5901 - val_accuracy: 0.7860 - val_loss: 1.3342\n",
|
| 555 |
-
"Epoch 8/20\n",
|
| 556 |
-
"\u001B[1m469/469\u001B[0m \u001B[32m━━━━━━━━━━━━━━━━━━━━\u001B[0m\u001B[37m\u001B[0m \u001B[1m141s\u001B[0m 224ms/step - accuracy: 0.6395 - loss: 1.4047 - val_accuracy: 0.8053 - val_loss: 1.1347\n",
|
| 557 |
-
"Epoch 9/20\n",
|
| 558 |
-
"\u001B[1m469/469\u001B[0m \u001B[32m━━━━━━━━━━━━━━━━━━━━\u001B[0m\u001B[37m\u001B[0m \u001B[1m106s\u001B[0m 226ms/step - accuracy: 0.6621 - loss: 1.2518 - val_accuracy: 0.8213 - val_loss: 0.9769\n",
|
| 559 |
-
"Epoch 10/20\n",
|
| 560 |
-
"\u001B[1m469/469\u001B[0m \u001B[32m━━━━━━━━━━━━━━━━━━━━\u001B[0m\u001B[37m\u001B[0m \u001B[1m118s\u001B[0m 252ms/step - accuracy: 0.6848 - loss: 1.1299 - val_accuracy: 0.8310 - val_loss: 0.8569\n",
|
| 561 |
-
"Epoch 11/20\n",
|
| 562 |
-
"\u001B[1m469/469\u001B[0m \u001B[32m━━━━━━━━━━━━━━━━━━━━\u001B[0m\u001B[37m\u001B[0m \u001B[1m140s\u001B[0m 298ms/step - accuracy: 0.7070 - loss: 1.0295 - val_accuracy: 0.8393 - val_loss: 0.7659\n",
|
| 563 |
-
"Epoch 12/20\n",
|
| 564 |
-
"\u001B[1m469/469\u001B[0m \u001B[32m━━━━━━━━━━━━━━━━━━━━\u001B[0m\u001B[37m\u001B[0m \u001B[1m139s\u001B[0m 297ms/step - accuracy: 0.7191 - loss: 0.9594 - val_accuracy: 0.8467 - val_loss: 0.6961\n",
|
| 565 |
-
"Epoch 13/20\n",
|
| 566 |
-
"\u001B[1m469/469\u001B[0m \u001B[32m━━━━━━━━━━━━━━━━━━━━\u001B[0m\u001B[37m\u001B[0m \u001B[1m139s\u001B[0m 297ms/step - accuracy: 0.7353 - loss: 0.8954 - val_accuracy: 0.8541 - val_loss: 0.6418\n",
|
| 567 |
-
"Epoch 14/20\n",
|
| 568 |
-
"\u001B[1m469/469\u001B[0m \u001B[32m━━━━━━━━━━━━━━━━━━━━\u001B[0m\u001B[37m\u001B[0m \u001B[1m141s\u001B[0m 301ms/step - accuracy: 0.7501 - loss: 0.8407 - val_accuracy: 0.8598 - val_loss: 0.5981\n",
|
| 569 |
-
"Epoch 15/20\n",
|
| 570 |
-
"\u001B[1m469/469\u001B[0m \u001B[32m━━━━━━━━━━━━━━━━━━━━\u001B[0m\u001B[37m\u001B[0m \u001B[1m145s\u001B[0m 309ms/step - accuracy: 0.7608 - loss: 0.8010 - val_accuracy: 0.8637 - val_loss: 0.5631\n",
|
| 571 |
-
"Epoch 16/20\n",
|
| 572 |
-
"\u001B[1m469/469\u001B[0m \u001B[32m━━━━━━━━━━━━━━━━━━━━\u001B[0m\u001B[37m\u001B[0m \u001B[1m140s\u001B[0m 299ms/step - accuracy: 0.7708 - loss: 0.7630 - val_accuracy: 0.8675 - val_loss: 0.5345\n",
|
| 573 |
-
"Epoch 17/20\n",
|
| 574 |
-
"\u001B[1m469/469\u001B[0m \u001B[32m━━━━━━━━━━━━━━━━━━━━\u001B[0m\u001B[37m\u001B[0m \u001B[1m140s\u001B[0m 294ms/step - accuracy: 0.7818 - loss: 0.7294 - val_accuracy: 0.8720 - val_loss: 0.5091\n",
|
| 575 |
-
"Epoch 18/20\n",
|
| 576 |
-
"\u001B[1m469/469\u001B[0m \u001B[32m━━━━━━━━━━━━━━━━━━━━\u001B[0m\u001B[37m\u001B[0m \u001B[1m136s\u001B[0m 290ms/step - accuracy: 0.7874 - loss: 0.7058 - val_accuracy: 0.8745 - val_loss: 0.4890\n",
|
| 577 |
-
"Epoch 19/20\n",
|
| 578 |
-
"\u001B[1m469/469\u001B[0m \u001B[32m━━━━━━━━━━━━━━━━━━━━\u001B[0m\u001B[37m\u001B[0m \u001B[1m141s\u001B[0m 301ms/step - accuracy: 0.7911 - loss: 0.6918 - val_accuracy: 0.8777 - val_loss: 0.4711\n",
|
| 579 |
-
"Epoch 20/20\n",
|
| 580 |
-
"\u001B[1m469/469\u001B[0m \u001B[32m━━━━━━━━━━━━━━━━━━━━\u001B[0m\u001B[37m\u001B[0m \u001B[1m144s\u001B[0m 306ms/step - accuracy: 0.7922 - loss: 0.6763 - val_accuracy: 0.8803 - val_loss: 0.4562\n",
|
| 581 |
-
"Test loss: 0.4561978578567505\n",
|
| 582 |
-
"Test accuracy: 0.880299985408783\n"
|
| 583 |
-
]
|
| 584 |
-
}
|
| 585 |
-
],
|
| 586 |
-
"execution_count": 19
|
| 587 |
-
},
|
| 588 |
-
{
|
| 589 |
-
"metadata": {},
|
| 590 |
-
"cell_type": "code",
|
| 591 |
-
"outputs": [],
|
| 592 |
-
"execution_count": null,
|
| 593 |
-
"source": ""
|
| 594 |
-
},
|
| 595 |
-
{
|
| 596 |
-
"metadata": {
|
| 597 |
-
"ExecuteTime": {
|
| 598 |
-
"end_time": "2025-03-13T13:19:19.771353Z",
|
| 599 |
-
"start_time": "2025-03-13T13:19:19.412004Z"
|
| 600 |
-
}
|
| 601 |
-
},
|
| 602 |
-
"cell_type": "code",
|
| 603 |
-
"source": "model.save(\"mnist_simple_cnn.h5\")",
|
| 604 |
-
"outputs": [
|
| 605 |
-
{
|
| 606 |
-
"name": "stderr",
|
| 607 |
-
"output_type": "stream",
|
| 608 |
-
"text": [
|
| 609 |
-
"WARNING:absl:You are saving your model as an HDF5 file via `model.save()` or `keras.saving.save_model(model)`. This file format is considered legacy. We recommend using instead the native Keras format, e.g. `model.save('my_model.keras')` or `keras.saving.save_model(model, 'my_model.keras')`. \n"
|
| 610 |
-
]
|
| 611 |
-
}
|
| 612 |
-
],
|
| 613 |
-
"execution_count": 20
|
| 614 |
-
},
|
| 615 |
-
{
|
| 616 |
-
"metadata": {
|
| 617 |
-
"ExecuteTime": {
|
| 618 |
-
"end_time": "2025-03-13T13:20:22.752001Z",
|
| 619 |
-
"start_time": "2025-03-13T13:20:14.810410Z"
|
| 620 |
-
}
|
| 621 |
-
},
|
| 622 |
-
"cell_type": "code",
|
| 623 |
-
"source": "score = model.evaluate(x_test, y_test, verbose=0)\n",
|
| 624 |
-
"outputs": [],
|
| 625 |
-
"execution_count": 22
|
| 626 |
-
},
|
| 627 |
-
{
|
| 628 |
-
"metadata": {},
|
| 629 |
-
"cell_type": "code",
|
| 630 |
-
"outputs": [],
|
| 631 |
-
"execution_count": null,
|
| 632 |
-
"source": ""
|
| 633 |
-
}
|
| 634 |
-
],
|
| 635 |
-
"metadata": {
|
| 636 |
-
"kernelspec": {
|
| 637 |
-
"display_name": "Python 3 (ipykernel)",
|
| 638 |
-
"language": "python",
|
| 639 |
-
"name": "python3"
|
| 640 |
-
},
|
| 641 |
-
"language_info": {
|
| 642 |
-
"codemirror_mode": {
|
| 643 |
-
"name": "ipython",
|
| 644 |
-
"version": 3
|
| 645 |
-
},
|
| 646 |
-
"file_extension": ".py",
|
| 647 |
-
"mimetype": "text/x-python",
|
| 648 |
-
"name": "python",
|
| 649 |
-
"nbconvert_exporter": "python",
|
| 650 |
-
"pygments_lexer": "ipython3",
|
| 651 |
-
"version": "3.10.11"
|
| 652 |
-
}
|
| 653 |
-
},
|
| 654 |
-
"nbformat": 4,
|
| 655 |
-
"nbformat_minor": 2
|
| 656 |
-
}
|
|
|
|
| 1 |
+
version https://git-lfs.github.com/spec/v1
|
| 2 |
+
oid sha256:f61bd0884c6d046620894ba4f0fdc940664827f0332829a6cd740f8e6b602873
|
| 3 |
+
size 44954
|
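A note on the cells removed above (and on the CIFAR-10 notebook that follows): they originally called Sequential.predict_classes(), an API that was deprecated and then removed from tf.keras around TensorFlow 2.6; the corrected lines in this listing use an argmax over predict() instead. A standalone sketch of the replacement, assuming `classifier` (a compiled softmax model) and `input_im` (shape (1, 28, 28, 1)) from the notebook are in scope:

# Equivalent of the removed predict_classes() call for a softmax classifier.
import numpy as np

probs = classifier.predict(input_im, verbose=0)  # shape (1, 10) class probabilities
pred_class = int(np.argmax(probs, axis=1)[0])    # predicted digit, 0-9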
4. Get Started! Handwritting Recognition, Simple Object Classification & OpenCV Demo/4.2 - Image Classifier - CIFAR10.ipynb
CHANGED
|
@@ -1,167 +1,3 @@
|
|
| 1 |
-
|
| 2 |
-
|
| 3 |
-
|
| 4 |
-
"cell_type": "markdown",
|
| 5 |
-
"metadata": {},
|
| 6 |
-
"source": [
|
| 7 |
-
"### Let's run a simple image classifier, using the CIFAR 10 dataset of 10 image categories\n",
|
| 8 |
-
"* CIFAR's 10 categories:\n",
|
| 9 |
-
" * airplane\n",
|
| 10 |
-
" * automobile\n",
|
| 11 |
-
" * bird\n",
|
| 12 |
-
" * cat\n",
|
| 13 |
-
" * deer\n",
|
| 14 |
-
" * dog\n",
|
| 15 |
-
" * frog\n",
|
| 16 |
-
" * horse\n",
|
| 17 |
-
" * ship\n",
|
| 18 |
-
" * truck"
|
| 19 |
-
]
|
| 20 |
-
},
|
| 21 |
-
{
|
| 22 |
-
"cell_type": "code",
|
| 23 |
-
"metadata": {
|
| 24 |
-
"ExecuteTime": {
|
| 25 |
-
"end_time": "2025-03-13T16:11:52.999533Z",
|
| 26 |
-
"start_time": "2025-03-13T16:11:47.171043Z"
|
| 27 |
-
}
|
| 28 |
-
},
|
| 29 |
-
"source": [
|
| 30 |
-
"import cv2\n",
|
| 31 |
-
"import numpy as np\n",
|
| 32 |
-
"from tensorflow.keras.models import load_model\n",
|
| 33 |
-
"from tensorflow.keras.datasets import cifar10 \n",
|
| 34 |
-
"\n",
|
| 35 |
-
"img_row, img_height, img_depth = 32,32,3\n",
|
| 36 |
-
"\n",
|
| 37 |
-
"classifier = load_model('cifar_simple_cnn.h5')\n",
|
| 38 |
-
"\n",
|
| 39 |
-
"# Loads the CIFAR dataset\n",
|
| 40 |
-
"(x_train, y_train), (x_test, y_test) = cifar10.load_data()\n",
|
| 41 |
-
"color = True \n",
|
| 42 |
-
"scale = 8\n",
|
| 43 |
-
"\n",
|
| 44 |
-
"def draw_test(name, res, input_im, scale, img_row, img_height):\n",
|
| 45 |
-
" BLACK = [0,0,0]\n",
|
| 46 |
-
" res = int(res)\n",
|
| 47 |
-
" if res == 0:\n",
|
| 48 |
-
" pred = \"airplane\"\n",
|
| 49 |
-
" if res == 1:\n",
|
| 50 |
-
" pred = \"automobile\"\n",
|
| 51 |
-
" if res == 2:\n",
|
| 52 |
-
" pred = \"bird\"\n",
|
| 53 |
-
" if res == 3:\n",
|
| 54 |
-
" pred = \"cat\"\n",
|
| 55 |
-
" if res == 4:\n",
|
| 56 |
-
" pred = \"deer\"\n",
|
| 57 |
-
" if res == 5:\n",
|
| 58 |
-
" pred = \"dog\"\n",
|
| 59 |
-
" if res == 6:\n",
|
| 60 |
-
" pred = \"frog\"\n",
|
| 61 |
-
" if res == 7:\n",
|
| 62 |
-
" pred = \"horse\"\n",
|
| 63 |
-
" if res == 8:\n",
|
| 64 |
-
" pred = \"ship\"\n",
|
| 65 |
-
" if res == 9:\n",
|
| 66 |
-
" pred = \"truck\"\n",
|
| 67 |
-
" \n",
|
| 68 |
-
" expanded_image = cv2.copyMakeBorder(input_im, 0, 0, 0, imageL.shape[0]*2 ,cv2.BORDER_CONSTANT,value=BLACK)\n",
|
| 69 |
-
" if color == False:\n",
|
| 70 |
-
" expanded_image = cv2.cvtColor(expanded_image, cv2.COLOR_GRAY2BGR)\n",
|
| 71 |
-
" cv2.putText(expanded_image, str(pred), (300, 80) , cv2.FONT_HERSHEY_COMPLEX_SMALL,4, (0,255,0), 2)\n",
|
| 72 |
-
" cv2.imshow(name, expanded_image)\n",
|
| 73 |
-
"\n",
|
| 74 |
-
"\n",
|
| 75 |
-
"for i in range(0,10):\n",
|
| 76 |
-
" rand = np.random.randint(0,len(x_test))\n",
|
| 77 |
-
" input_im = x_test[rand]\n",
|
| 78 |
-
" imageL = cv2.resize(input_im, None, fx=scale, fy=scale, interpolation = cv2.INTER_CUBIC) \n",
|
| 79 |
-
" input_im = input_im.reshape(1,img_row, img_height, img_depth) \n",
|
| 80 |
-
" \n",
|
| 81 |
-
" ## Get Prediction\n",
|
| 82 |
-
" res = str(classifier.predict_classes(input_im, 1, verbose = 0)[0])\n",
|
| 83 |
-
" \n",
|
| 84 |
-
" draw_test(\"Prediction\", res, imageL, scale, img_row, img_height) \n",
|
| 85 |
-
" cv2.waitKey(0)\n",
|
| 86 |
-
"\n",
|
| 87 |
-
"cv2.destroyAllWindows()"
|
| 88 |
-
],
|
| 89 |
-
"outputs": [
|
| 90 |
-
{
|
| 91 |
-
"name": "stderr",
|
| 92 |
-
"output_type": "stream",
|
| 93 |
-
"text": [
|
| 94 |
-
"2025-03-13 21:41:47.742478: I external/local_xla/xla/tsl/cuda/cudart_stub.cc:32] Could not find cuda drivers on your machine, GPU will not be used.\n",
|
| 95 |
-
"2025-03-13 21:41:47.746738: I external/local_xla/xla/tsl/cuda/cudart_stub.cc:32] Could not find cuda drivers on your machine, GPU will not be used.\n",
|
| 96 |
-
"2025-03-13 21:41:47.759288: E external/local_xla/xla/stream_executor/cuda/cuda_fft.cc:477] Unable to register cuFFT factory: Attempting to register factory for plugin cuFFT when one has already been registered\n",
|
| 97 |
-
"WARNING: All log messages before absl::InitializeLog() is called are written to STDERR\n",
|
| 98 |
-
"E0000 00:00:1741882307.783990 24522 cuda_dnn.cc:8310] Unable to register cuDNN factory: Attempting to register factory for plugin cuDNN when one has already been registered\n",
|
| 99 |
-
"E0000 00:00:1741882307.790137 24522 cuda_blas.cc:1418] Unable to register cuBLAS factory: Attempting to register factory for plugin cuBLAS when one has already been registered\n",
|
| 100 |
-
"2025-03-13 21:41:47.814207: I tensorflow/core/platform/cpu_feature_guard.cc:210] This TensorFlow binary is optimized to use available CPU instructions in performance-critical operations.\n",
|
| 101 |
-
"To enable the following instructions: AVX2 FMA, in other operations, rebuild TensorFlow with the appropriate compiler flags.\n",
|
| 102 |
-
"/home/newton/miniconda3/envs/dl/lib/python3.12/site-packages/keras/src/layers/convolutional/base_conv.py:107: UserWarning: Do not pass an `input_shape`/`input_dim` argument to a layer. When using Sequential models, prefer using an `Input(shape)` object as the first layer in the model instead.\n",
|
| 103 |
-
" super().__init__(activity_regularizer=activity_regularizer, **kwargs)\n"
|
| 104 |
-
]
|
| 105 |
-
},
|
| 106 |
-
{
|
| 107 |
-
"ename": "ValueError",
|
| 108 |
-
"evalue": "Kernel shape must have the same length as input, but received kernel of shape (3, 3, 3, 32) and input of shape (None, None, 32, 32, 3).",
|
| 109 |
-
"output_type": "error",
|
| 110 |
-
"traceback": [
|
| 111 |
-
"\u001B[0;31m---------------------------------------------------------------------------\u001B[0m",
|
| 112 |
-
"\u001B[0;31mValueError\u001B[0m Traceback (most recent call last)",
|
| 113 |
-
"Cell \u001B[0;32mIn[1], line 8\u001B[0m\n\u001B[1;32m 4\u001B[0m \u001B[38;5;28;01mfrom\u001B[39;00m \u001B[38;5;21;01mtensorflow\u001B[39;00m\u001B[38;5;21;01m.\u001B[39;00m\u001B[38;5;21;01mkeras\u001B[39;00m\u001B[38;5;21;01m.\u001B[39;00m\u001B[38;5;21;01mdatasets\u001B[39;00m \u001B[38;5;28;01mimport\u001B[39;00m cifar10 \n\u001B[1;32m 6\u001B[0m img_row, img_height, img_depth \u001B[38;5;241m=\u001B[39m \u001B[38;5;241m32\u001B[39m,\u001B[38;5;241m32\u001B[39m,\u001B[38;5;241m3\u001B[39m\n\u001B[0;32m----> 8\u001B[0m classifier \u001B[38;5;241m=\u001B[39m load_model(\u001B[38;5;124m'\u001B[39m\u001B[38;5;124mcifar_simple_cnn.h5\u001B[39m\u001B[38;5;124m'\u001B[39m)\n\u001B[1;32m 10\u001B[0m \u001B[38;5;66;03m# Loads the CIFAR dataset\u001B[39;00m\n\u001B[1;32m 11\u001B[0m (x_train, y_train), (x_test, y_test) \u001B[38;5;241m=\u001B[39m cifar10\u001B[38;5;241m.\u001B[39mload_data()\n",
|
| 114 |
-
"File \u001B[0;32m~/miniconda3/envs/dl/lib/python3.12/site-packages/keras/src/saving/saving_api.py:196\u001B[0m, in \u001B[0;36mload_model\u001B[0;34m(filepath, custom_objects, compile, safe_mode)\u001B[0m\n\u001B[1;32m 189\u001B[0m \u001B[38;5;28;01mreturn\u001B[39;00m saving_lib\u001B[38;5;241m.\u001B[39mload_model(\n\u001B[1;32m 190\u001B[0m filepath,\n\u001B[1;32m 191\u001B[0m custom_objects\u001B[38;5;241m=\u001B[39mcustom_objects,\n\u001B[1;32m 192\u001B[0m \u001B[38;5;28mcompile\u001B[39m\u001B[38;5;241m=\u001B[39m\u001B[38;5;28mcompile\u001B[39m,\n\u001B[1;32m 193\u001B[0m safe_mode\u001B[38;5;241m=\u001B[39msafe_mode,\n\u001B[1;32m 194\u001B[0m )\n\u001B[1;32m 195\u001B[0m \u001B[38;5;28;01mif\u001B[39;00m \u001B[38;5;28mstr\u001B[39m(filepath)\u001B[38;5;241m.\u001B[39mendswith((\u001B[38;5;124m\"\u001B[39m\u001B[38;5;124m.h5\u001B[39m\u001B[38;5;124m\"\u001B[39m, \u001B[38;5;124m\"\u001B[39m\u001B[38;5;124m.hdf5\u001B[39m\u001B[38;5;124m\"\u001B[39m)):\n\u001B[0;32m--> 196\u001B[0m \u001B[38;5;28;01mreturn\u001B[39;00m legacy_h5_format\u001B[38;5;241m.\u001B[39mload_model_from_hdf5(\n\u001B[1;32m 197\u001B[0m filepath, custom_objects\u001B[38;5;241m=\u001B[39mcustom_objects, \u001B[38;5;28mcompile\u001B[39m\u001B[38;5;241m=\u001B[39m\u001B[38;5;28mcompile\u001B[39m\n\u001B[1;32m 198\u001B[0m )\n\u001B[1;32m 199\u001B[0m \u001B[38;5;28;01melif\u001B[39;00m \u001B[38;5;28mstr\u001B[39m(filepath)\u001B[38;5;241m.\u001B[39mendswith(\u001B[38;5;124m\"\u001B[39m\u001B[38;5;124m.keras\u001B[39m\u001B[38;5;124m\"\u001B[39m):\n\u001B[1;32m 200\u001B[0m \u001B[38;5;28;01mraise\u001B[39;00m \u001B[38;5;167;01mValueError\u001B[39;00m(\n\u001B[1;32m 201\u001B[0m \u001B[38;5;124mf\u001B[39m\u001B[38;5;124m\"\u001B[39m\u001B[38;5;124mFile not found: filepath=\u001B[39m\u001B[38;5;132;01m{\u001B[39;00mfilepath\u001B[38;5;132;01m}\u001B[39;00m\u001B[38;5;124m. \u001B[39m\u001B[38;5;124m\"\u001B[39m\n\u001B[1;32m 202\u001B[0m \u001B[38;5;124m\"\u001B[39m\u001B[38;5;124mPlease ensure the file is an accessible `.keras` \u001B[39m\u001B[38;5;124m\"\u001B[39m\n\u001B[1;32m 203\u001B[0m \u001B[38;5;124m\"\u001B[39m\u001B[38;5;124mzip file.\u001B[39m\u001B[38;5;124m\"\u001B[39m\n\u001B[1;32m 204\u001B[0m )\n",
|
| 115 |
-
"File \u001B[0;32m~/miniconda3/envs/dl/lib/python3.12/site-packages/keras/src/legacy/saving/legacy_h5_format.py:133\u001B[0m, in \u001B[0;36mload_model_from_hdf5\u001B[0;34m(filepath, custom_objects, compile)\u001B[0m\n\u001B[1;32m 130\u001B[0m model_config \u001B[38;5;241m=\u001B[39m json_utils\u001B[38;5;241m.\u001B[39mdecode(model_config)\n\u001B[1;32m 132\u001B[0m \u001B[38;5;28;01mwith\u001B[39;00m saving_options\u001B[38;5;241m.\u001B[39mkeras_option_scope(use_legacy_config\u001B[38;5;241m=\u001B[39m\u001B[38;5;28;01mTrue\u001B[39;00m):\n\u001B[0;32m--> 133\u001B[0m model \u001B[38;5;241m=\u001B[39m saving_utils\u001B[38;5;241m.\u001B[39mmodel_from_config(\n\u001B[1;32m 134\u001B[0m model_config, custom_objects\u001B[38;5;241m=\u001B[39mcustom_objects\n\u001B[1;32m 135\u001B[0m )\n\u001B[1;32m 137\u001B[0m \u001B[38;5;66;03m# set weights\u001B[39;00m\n\u001B[1;32m 138\u001B[0m load_weights_from_hdf5_group(f[\u001B[38;5;124m\"\u001B[39m\u001B[38;5;124mmodel_weights\u001B[39m\u001B[38;5;124m\"\u001B[39m], model)\n",
|
| 116 |
-
"File \u001B[0;32m~/miniconda3/envs/dl/lib/python3.12/site-packages/keras/src/legacy/saving/saving_utils.py:85\u001B[0m, in \u001B[0;36mmodel_from_config\u001B[0;34m(config, custom_objects)\u001B[0m\n\u001B[1;32m 81\u001B[0m \u001B[38;5;66;03m# TODO(nkovela): Swap find and replace args during Keras 3.0 release\u001B[39;00m\n\u001B[1;32m 82\u001B[0m \u001B[38;5;66;03m# Replace keras refs with keras\u001B[39;00m\n\u001B[1;32m 83\u001B[0m config \u001B[38;5;241m=\u001B[39m _find_replace_nested_dict(config, \u001B[38;5;124m\"\u001B[39m\u001B[38;5;124mkeras.\u001B[39m\u001B[38;5;124m\"\u001B[39m, \u001B[38;5;124m\"\u001B[39m\u001B[38;5;124mkeras.\u001B[39m\u001B[38;5;124m\"\u001B[39m)\n\u001B[0;32m---> 85\u001B[0m \u001B[38;5;28;01mreturn\u001B[39;00m serialization\u001B[38;5;241m.\u001B[39mdeserialize_keras_object(\n\u001B[1;32m 86\u001B[0m config,\n\u001B[1;32m 87\u001B[0m module_objects\u001B[38;5;241m=\u001B[39mMODULE_OBJECTS\u001B[38;5;241m.\u001B[39mALL_OBJECTS,\n\u001B[1;32m 88\u001B[0m custom_objects\u001B[38;5;241m=\u001B[39mcustom_objects,\n\u001B[1;32m 89\u001B[0m printable_module_name\u001B[38;5;241m=\u001B[39m\u001B[38;5;124m\"\u001B[39m\u001B[38;5;124mlayer\u001B[39m\u001B[38;5;124m\"\u001B[39m,\n\u001B[1;32m 90\u001B[0m )\n",
|
| 117 |
-
"File \u001B[0;32m~/miniconda3/envs/dl/lib/python3.12/site-packages/keras/src/legacy/saving/serialization.py:495\u001B[0m, in \u001B[0;36mdeserialize_keras_object\u001B[0;34m(identifier, module_objects, custom_objects, printable_module_name)\u001B[0m\n\u001B[1;32m 490\u001B[0m cls_config \u001B[38;5;241m=\u001B[39m _find_replace_nested_dict(\n\u001B[1;32m 491\u001B[0m cls_config, \u001B[38;5;124m\"\u001B[39m\u001B[38;5;124mkeras.\u001B[39m\u001B[38;5;124m\"\u001B[39m, \u001B[38;5;124m\"\u001B[39m\u001B[38;5;124mkeras.\u001B[39m\u001B[38;5;124m\"\u001B[39m\n\u001B[1;32m 492\u001B[0m )\n\u001B[1;32m 494\u001B[0m \u001B[38;5;28;01mif\u001B[39;00m \u001B[38;5;124m\"\u001B[39m\u001B[38;5;124mcustom_objects\u001B[39m\u001B[38;5;124m\"\u001B[39m \u001B[38;5;129;01min\u001B[39;00m arg_spec\u001B[38;5;241m.\u001B[39margs:\n\u001B[0;32m--> 495\u001B[0m deserialized_obj \u001B[38;5;241m=\u001B[39m \u001B[38;5;28mcls\u001B[39m\u001B[38;5;241m.\u001B[39mfrom_config(\n\u001B[1;32m 496\u001B[0m cls_config,\n\u001B[1;32m 497\u001B[0m custom_objects\u001B[38;5;241m=\u001B[39m{\n\u001B[1;32m 498\u001B[0m \u001B[38;5;241m*\u001B[39m\u001B[38;5;241m*\u001B[39mobject_registration\u001B[38;5;241m.\u001B[39mGLOBAL_CUSTOM_OBJECTS,\n\u001B[1;32m 499\u001B[0m \u001B[38;5;241m*\u001B[39m\u001B[38;5;241m*\u001B[39mcustom_objects,\n\u001B[1;32m 500\u001B[0m },\n\u001B[1;32m 501\u001B[0m )\n\u001B[1;32m 502\u001B[0m \u001B[38;5;28;01melse\u001B[39;00m:\n\u001B[1;32m 503\u001B[0m \u001B[38;5;28;01mwith\u001B[39;00m object_registration\u001B[38;5;241m.\u001B[39mCustomObjectScope(custom_objects):\n",
|
| 118 |
-
"File \u001B[0;32m~/miniconda3/envs/dl/lib/python3.12/site-packages/keras/src/models/sequential.py:359\u001B[0m, in \u001B[0;36mSequential.from_config\u001B[0;34m(cls, config, custom_objects)\u001B[0m\n\u001B[1;32m 354\u001B[0m \u001B[38;5;28;01melse\u001B[39;00m:\n\u001B[1;32m 355\u001B[0m layer \u001B[38;5;241m=\u001B[39m serialization_lib\u001B[38;5;241m.\u001B[39mdeserialize_keras_object(\n\u001B[1;32m 356\u001B[0m layer_config,\n\u001B[1;32m 357\u001B[0m custom_objects\u001B[38;5;241m=\u001B[39mcustom_objects,\n\u001B[1;32m 358\u001B[0m )\n\u001B[0;32m--> 359\u001B[0m model\u001B[38;5;241m.\u001B[39madd(layer)\n\u001B[1;32m 360\u001B[0m \u001B[38;5;28;01mif\u001B[39;00m (\n\u001B[1;32m 361\u001B[0m \u001B[38;5;129;01mnot\u001B[39;00m model\u001B[38;5;241m.\u001B[39m_functional\n\u001B[1;32m 362\u001B[0m \u001B[38;5;129;01mand\u001B[39;00m \u001B[38;5;124m\"\u001B[39m\u001B[38;5;124mbuild_input_shape\u001B[39m\u001B[38;5;124m\"\u001B[39m \u001B[38;5;129;01min\u001B[39;00m \u001B[38;5;28mlocals\u001B[39m()\n\u001B[1;32m 363\u001B[0m \u001B[38;5;129;01mand\u001B[39;00m build_input_shape\n\u001B[1;32m 364\u001B[0m \u001B[38;5;129;01mand\u001B[39;00m \u001B[38;5;28misinstance\u001B[39m(build_input_shape, (\u001B[38;5;28mtuple\u001B[39m, \u001B[38;5;28mlist\u001B[39m))\n\u001B[1;32m 365\u001B[0m ):\n\u001B[1;32m 366\u001B[0m model\u001B[38;5;241m.\u001B[39mbuild(build_input_shape)\n",
|
| 119 |
-
"File \u001B[0;32m~/miniconda3/envs/dl/lib/python3.12/site-packages/keras/src/models/sequential.py:122\u001B[0m, in \u001B[0;36mSequential.add\u001B[0;34m(self, layer, rebuild)\u001B[0m\n\u001B[1;32m 120\u001B[0m \u001B[38;5;28mself\u001B[39m\u001B[38;5;241m.\u001B[39m_layers\u001B[38;5;241m.\u001B[39mappend(layer)\n\u001B[1;32m 121\u001B[0m \u001B[38;5;28;01mif\u001B[39;00m rebuild:\n\u001B[0;32m--> 122\u001B[0m \u001B[38;5;28mself\u001B[39m\u001B[38;5;241m.\u001B[39m_maybe_rebuild()\n\u001B[1;32m 123\u001B[0m \u001B[38;5;28;01melse\u001B[39;00m:\n\u001B[1;32m 124\u001B[0m \u001B[38;5;28mself\u001B[39m\u001B[38;5;241m.\u001B[39mbuilt \u001B[38;5;241m=\u001B[39m \u001B[38;5;28;01mFalse\u001B[39;00m\n",
|
| 120 |
-
"File \u001B[0;32m~/miniconda3/envs/dl/lib/python3.12/site-packages/keras/src/models/sequential.py:141\u001B[0m, in \u001B[0;36mSequential._maybe_rebuild\u001B[0;34m(self)\u001B[0m\n\u001B[1;32m 139\u001B[0m \u001B[38;5;28;01mif\u001B[39;00m \u001B[38;5;28misinstance\u001B[39m(\u001B[38;5;28mself\u001B[39m\u001B[38;5;241m.\u001B[39m_layers[\u001B[38;5;241m0\u001B[39m], InputLayer) \u001B[38;5;129;01mand\u001B[39;00m \u001B[38;5;28mlen\u001B[39m(\u001B[38;5;28mself\u001B[39m\u001B[38;5;241m.\u001B[39m_layers) \u001B[38;5;241m>\u001B[39m \u001B[38;5;241m1\u001B[39m:\n\u001B[1;32m 140\u001B[0m input_shape \u001B[38;5;241m=\u001B[39m \u001B[38;5;28mself\u001B[39m\u001B[38;5;241m.\u001B[39m_layers[\u001B[38;5;241m0\u001B[39m]\u001B[38;5;241m.\u001B[39mbatch_shape\n\u001B[0;32m--> 141\u001B[0m \u001B[38;5;28mself\u001B[39m\u001B[38;5;241m.\u001B[39mbuild(input_shape)\n\u001B[1;32m 142\u001B[0m \u001B[38;5;28;01melif\u001B[39;00m \u001B[38;5;28mhasattr\u001B[39m(\u001B[38;5;28mself\u001B[39m\u001B[38;5;241m.\u001B[39m_layers[\u001B[38;5;241m0\u001B[39m], \u001B[38;5;124m\"\u001B[39m\u001B[38;5;124minput_shape\u001B[39m\u001B[38;5;124m\"\u001B[39m) \u001B[38;5;129;01mand\u001B[39;00m \u001B[38;5;28mlen\u001B[39m(\u001B[38;5;28mself\u001B[39m\u001B[38;5;241m.\u001B[39m_layers) \u001B[38;5;241m>\u001B[39m \u001B[38;5;241m1\u001B[39m:\n\u001B[1;32m 143\u001B[0m \u001B[38;5;66;03m# We can build the Sequential model if the first layer has the\u001B[39;00m\n\u001B[1;32m 144\u001B[0m \u001B[38;5;66;03m# `input_shape` property. This is most commonly found in Functional\u001B[39;00m\n\u001B[1;32m 145\u001B[0m \u001B[38;5;66;03m# model.\u001B[39;00m\n\u001B[1;32m 146\u001B[0m input_shape \u001B[38;5;241m=\u001B[39m \u001B[38;5;28mself\u001B[39m\u001B[38;5;241m.\u001B[39m_layers[\u001B[38;5;241m0\u001B[39m]\u001B[38;5;241m.\u001B[39minput_shape\n",
|
| 121 |
-
"File \u001B[0;32m~/miniconda3/envs/dl/lib/python3.12/site-packages/keras/src/layers/layer.py:228\u001B[0m, in \u001B[0;36mLayer.__new__.<locals>.build_wrapper\u001B[0;34m(*args, **kwargs)\u001B[0m\n\u001B[1;32m 226\u001B[0m \u001B[38;5;28;01mwith\u001B[39;00m obj\u001B[38;5;241m.\u001B[39m_open_name_scope():\n\u001B[1;32m 227\u001B[0m obj\u001B[38;5;241m.\u001B[39m_path \u001B[38;5;241m=\u001B[39m current_path()\n\u001B[0;32m--> 228\u001B[0m original_build_method(\u001B[38;5;241m*\u001B[39margs, \u001B[38;5;241m*\u001B[39m\u001B[38;5;241m*\u001B[39mkwargs)\n\u001B[1;32m 229\u001B[0m \u001B[38;5;66;03m# Record build config.\u001B[39;00m\n\u001B[1;32m 230\u001B[0m signature \u001B[38;5;241m=\u001B[39m inspect\u001B[38;5;241m.\u001B[39msignature(original_build_method)\n",
|
| 122 |
-
"File \u001B[0;32m~/miniconda3/envs/dl/lib/python3.12/site-packages/keras/src/models/sequential.py:187\u001B[0m, in \u001B[0;36mSequential.build\u001B[0;34m(self, input_shape)\u001B[0m\n\u001B[1;32m 185\u001B[0m \u001B[38;5;28;01mfor\u001B[39;00m layer \u001B[38;5;129;01min\u001B[39;00m \u001B[38;5;28mself\u001B[39m\u001B[38;5;241m.\u001B[39m_layers[\u001B[38;5;241m1\u001B[39m:]:\n\u001B[1;32m 186\u001B[0m \u001B[38;5;28;01mtry\u001B[39;00m:\n\u001B[0;32m--> 187\u001B[0m x \u001B[38;5;241m=\u001B[39m layer(x)\n\u001B[1;32m 188\u001B[0m \u001B[38;5;28;01mexcept\u001B[39;00m \u001B[38;5;167;01mNotImplementedError\u001B[39;00m:\n\u001B[1;32m 189\u001B[0m \u001B[38;5;66;03m# Can happen if shape inference is not implemented.\u001B[39;00m\n\u001B[1;32m 190\u001B[0m \u001B[38;5;66;03m# TODO: consider reverting inbound nodes on layers processed.\u001B[39;00m\n\u001B[1;32m 191\u001B[0m \u001B[38;5;28;01mreturn\u001B[39;00m\n",
|
| 123 |
-
"File \u001B[0;32m~/miniconda3/envs/dl/lib/python3.12/site-packages/keras/src/utils/traceback_utils.py:122\u001B[0m, in \u001B[0;36mfilter_traceback.<locals>.error_handler\u001B[0;34m(*args, **kwargs)\u001B[0m\n\u001B[1;32m 119\u001B[0m filtered_tb \u001B[38;5;241m=\u001B[39m _process_traceback_frames(e\u001B[38;5;241m.\u001B[39m__traceback__)\n\u001B[1;32m 120\u001B[0m \u001B[38;5;66;03m# To get the full stack trace, call:\u001B[39;00m\n\u001B[1;32m 121\u001B[0m \u001B[38;5;66;03m# `keras.config.disable_traceback_filtering()`\u001B[39;00m\n\u001B[0;32m--> 122\u001B[0m \u001B[38;5;28;01mraise\u001B[39;00m e\u001B[38;5;241m.\u001B[39mwith_traceback(filtered_tb) \u001B[38;5;28;01mfrom\u001B[39;00m \u001B[38;5;28;01mNone\u001B[39;00m\n\u001B[1;32m 123\u001B[0m \u001B[38;5;28;01mfinally\u001B[39;00m:\n\u001B[1;32m 124\u001B[0m \u001B[38;5;28;01mdel\u001B[39;00m filtered_tb\n",
|
| 124 |
-
"File \u001B[0;32m~/miniconda3/envs/dl/lib/python3.12/site-packages/keras/src/ops/operation_utils.py:184\u001B[0m, in \u001B[0;36mcompute_conv_output_shape\u001B[0;34m(input_shape, filters, kernel_size, strides, padding, data_format, dilation_rate)\u001B[0m\n\u001B[1;32m 182\u001B[0m kernel_shape \u001B[38;5;241m=\u001B[39m kernel_size \u001B[38;5;241m+\u001B[39m (input_shape[\u001B[38;5;241m1\u001B[39m], filters)\n\u001B[1;32m 183\u001B[0m \u001B[38;5;28;01mif\u001B[39;00m \u001B[38;5;28mlen\u001B[39m(kernel_shape) \u001B[38;5;241m!=\u001B[39m \u001B[38;5;28mlen\u001B[39m(input_shape):\n\u001B[0;32m--> 184\u001B[0m \u001B[38;5;28;01mraise\u001B[39;00m \u001B[38;5;167;01mValueError\u001B[39;00m(\n\u001B[1;32m 185\u001B[0m \u001B[38;5;124m\"\u001B[39m\u001B[38;5;124mKernel shape must have the same length as input, but received \u001B[39m\u001B[38;5;124m\"\u001B[39m\n\u001B[1;32m 186\u001B[0m \u001B[38;5;124mf\u001B[39m\u001B[38;5;124m\"\u001B[39m\u001B[38;5;124mkernel of shape \u001B[39m\u001B[38;5;132;01m{\u001B[39;00mkernel_shape\u001B[38;5;132;01m}\u001B[39;00m\u001B[38;5;124m and \u001B[39m\u001B[38;5;124m\"\u001B[39m\n\u001B[1;32m 187\u001B[0m \u001B[38;5;124mf\u001B[39m\u001B[38;5;124m\"\u001B[39m\u001B[38;5;124minput of shape \u001B[39m\u001B[38;5;132;01m{\u001B[39;00minput_shape\u001B[38;5;132;01m}\u001B[39;00m\u001B[38;5;124m.\u001B[39m\u001B[38;5;124m\"\u001B[39m\n\u001B[1;32m 188\u001B[0m )\n\u001B[1;32m 189\u001B[0m \u001B[38;5;28;01mif\u001B[39;00m \u001B[38;5;28misinstance\u001B[39m(dilation_rate, \u001B[38;5;28mint\u001B[39m):\n\u001B[1;32m 190\u001B[0m dilation_rate \u001B[38;5;241m=\u001B[39m (dilation_rate,) \u001B[38;5;241m*\u001B[39m \u001B[38;5;28mlen\u001B[39m(spatial_shape)\n",
|
| 125 |
-
"\u001B[0;31mValueError\u001B[0m: Kernel shape must have the same length as input, but received kernel of shape (3, 3, 3, 32) and input of shape (None, None, 32, 32, 3)."
|
| 126 |
-
]
|
| 127 |
-
}
|
| 128 |
-
],
|
| 129 |
-
"execution_count": 1
|
| 130 |
-
},
|
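The `pred = "truck"` fragment at the top of this hunk is the tail of an if/elif chain mapping the predicted digit to a CIFAR-10 label. A hedged sketch of the same mapping as a list lookup (the label order is CIFAR-10's standard index order; `res` is the digit string produced above):

    # CIFAR-10's ten labels in index order, so a lookup replaces the branching
    cifar10_labels = ["airplane", "automobile", "bird", "cat", "deer",
                      "dog", "frog", "horse", "ship", "truck"]
    pred = cifar10_labels[int(res)]   # replaces the if/elif chain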
| 131 |
-
{
|
| 132 |
-
"cell_type": "code",
|
| 133 |
-
"execution_count": null,
|
| 134 |
-
"metadata": {},
|
| 135 |
-
"outputs": [],
|
| 136 |
-
"source": []
|
| 137 |
-
},
|
| 138 |
-
{
|
| 139 |
-
"cell_type": "code",
|
| 140 |
-
"execution_count": null,
|
| 141 |
-
"metadata": {},
|
| 142 |
-
"outputs": [],
|
| 143 |
-
"source": []
|
| 144 |
-
}
|
| 145 |
-
],
|
| 146 |
-
"metadata": {
|
| 147 |
-
"kernelspec": {
|
| 148 |
-
"display_name": "Python 3",
|
| 149 |
-
"language": "python",
|
| 150 |
-
"name": "python3"
|
| 151 |
-
},
|
| 152 |
-
"language_info": {
|
| 153 |
-
"codemirror_mode": {
|
| 154 |
-
"name": "ipython",
|
| 155 |
-
"version": 3
|
| 156 |
-
},
|
| 157 |
-
"file_extension": ".py",
|
| 158 |
-
"mimetype": "text/x-python",
|
| 159 |
-
"name": "python",
|
| 160 |
-
"nbconvert_exporter": "python",
|
| 161 |
-
"pygments_lexer": "ipython3",
|
| 162 |
-
"version": "3.7.4"
|
| 163 |
-
}
|
| 164 |
-
},
|
| 165 |
-
"nbformat": 4,
|
| 166 |
-
"nbformat_minor": 2
|
| 167 |
-
}
|
|
|
|
| 1 |
+
version https://git-lfs.github.com/spec/v1
|
| 2 |
+
oid sha256:41e7ce99885479ae3cc66c615e4b0d505969ef050a8c31e59e2c23566340bb0f
|
| 3 |
+
size 22060
|
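The ValueError in this notebook's output comes from loading a legacy `.h5` model under a newer Keras, and `predict_classes` was removed from `Sequential` in later TensorFlow releases. A minimal sketch of a workaround, assuming the architecture is rebuilt in code as `classifier` so the legacy HDF5 file is used only as a weight store:

    import numpy as np

    # Assumption: `classifier` was rebuilt layer-by-layer to match the saved model
    classifier.load_weights('cifar_simple_cnn.h5')

    input_im = x_test[0].reshape(1, 32, 32, 3)       # add the batch dimension
    probs = classifier.predict(input_im, verbose=0)  # shape (1, 10)
    res = str(np.argmax(probs, axis=1)[0])           # replaces predict_classes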
4. Get Started! Handwritting Recognition, Simple Object Classification & OpenCV Demo/4.3. Live Sketching.ipynb
CHANGED
|
@@ -1,126 +1,3 @@
|
|
| 1 |
-
|
| 2 |
-
|
| 3 |
-
|
| 4 |
-
"cell_type": "code",
|
| 5 |
-
"metadata": {
|
| 6 |
-
"ExecuteTime": {
|
| 7 |
-
"end_time": "2025-03-13T12:18:54.614963Z",
|
| 8 |
-
"start_time": "2025-03-13T12:18:52.126661Z"
|
| 9 |
-
}
|
| 10 |
-
},
|
| 11 |
-
"source": [
|
| 12 |
-
"#import keras\n",
|
| 13 |
-
"import cv2\n",
|
| 14 |
-
"import numpy as np\n",
|
| 15 |
-
"import matplotlib\n",
|
| 16 |
-
"print (cv2.__version__)"
|
| 17 |
-
],
|
| 18 |
-
"outputs": [
|
| 19 |
-
{
|
| 20 |
-
"name": "stdout",
|
| 21 |
-
"output_type": "stream",
|
| 22 |
-
"text": [
|
| 23 |
-
"4.11.0\n"
|
| 24 |
-
]
|
| 25 |
-
}
|
| 26 |
-
],
|
| 27 |
-
"execution_count": 1
|
| 28 |
-
},
|
| 29 |
-
{
|
| 30 |
-
"cell_type": "code",
|
| 31 |
-
"metadata": {
|
| 32 |
-
"ExecuteTime": {
|
| 33 |
-
"end_time": "2025-03-13T12:25:16.378140Z",
|
| 34 |
-
"start_time": "2025-03-13T12:25:15.608514Z"
|
| 35 |
-
}
|
| 36 |
-
},
|
| 37 |
-
"source": [
|
| 38 |
-
"import cv2\n",
|
| 39 |
-
"import numpy as np\n",
|
| 40 |
-
"\n",
|
| 41 |
-
"# Our sketch generating function\n",
|
| 42 |
-
"def sketch(image):\n",
|
| 43 |
-
" # Convert image to grayscale\n",
|
| 44 |
-
" img_gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)\n",
|
| 45 |
-
" \n",
|
| 46 |
-
" # Clean up image using Guassian Blur\n",
|
| 47 |
-
" img_gray_blur = cv2.GaussianBlur(img_gray, (5,5), 0)\n",
|
| 48 |
-
" \n",
|
| 49 |
-
" # Extract edges\n",
|
| 50 |
-
" canny_edges = cv2.Canny(img_gray_blur, 20, 50)\n",
|
| 51 |
-
" \n",
|
| 52 |
-
" # Do an invert binarize the image \n",
|
| 53 |
-
" ret, mask = cv2.threshold(canny_edges, 70, 255, cv2.THRESH_BINARY_INV)\n",
|
| 54 |
-
" return mask\n",
|
| 55 |
-
"\n",
|
| 56 |
-
"\n",
|
| 57 |
-
"# Initialize webcam, cap is the object provided by VideoCapture\n",
|
| 58 |
-
"# It contains a boolean indicating if it was sucessful (ret)\n",
|
| 59 |
-
"# It also contains the images collected from the webcam (frame)\n",
|
| 60 |
-
"cap = cv2.VideoCapture(2)\n",
|
| 61 |
-
"\n",
|
| 62 |
-
"while True:\n",
|
| 63 |
-
" ret, frame = cap.read()\n",
|
| 64 |
-
" cv2.imshow('Our Live Sketcher', sketch(frame))\n",
|
| 65 |
-
" if cv2.waitKey(1) == 13 or cv2.waitKey(1) == 27: #13 is the Enter Key\n",
|
| 66 |
-
" break\n",
|
| 67 |
-
" \n",
|
| 68 |
-
"# Release camera and close windows\n",
|
| 69 |
-
"cap.release()\n",
|
| 70 |
-
"cv2.destroyAllWindows() "
|
| 71 |
-
],
|
| 72 |
-
"outputs": [
|
| 73 |
-
{
|
| 74 |
-
"name": "stderr",
|
| 75 |
-
"output_type": "stream",
|
| 76 |
-
"text": [
|
| 77 |
-
"[ WARN:0@382.189] global cap_v4l.cpp:913 open VIDEOIO(V4L2:/dev/video2): can't open camera by index\n",
|
| 78 |
-
"[ERROR:0@382.862] global obsensor_uvc_stream_channel.cpp:158 getStreamChannelGroup Camera index out of range\n",
|
| 79 |
-
"[ WARN:0@382.866] global cap_v4l.cpp:803 requestBuffers VIDEOIO(V4L2:/dev/video2): failed VIDIOC_REQBUFS: errno=19 (No such device)\n"
|
| 80 |
-
]
|
| 81 |
-
},
|
| 82 |
-
{
|
| 83 |
-
"ename": "error",
|
| 84 |
-
"evalue": "OpenCV(4.11.0) /io/opencv/modules/imgproc/src/color.cpp:199: error: (-215:Assertion failed) !_src.empty() in function 'cvtColor'\n",
|
| 85 |
-
"output_type": "error",
|
| 86 |
-
"traceback": [
|
| 87 |
-
"\u001B[0;31m---------------------------------------------------------------------------\u001B[0m",
|
| 88 |
-
"\u001B[0;31merror\u001B[0m Traceback (most recent call last)",
|
| 89 |
-
"Cell \u001B[0;32mIn[6], line 27\u001B[0m\n\u001B[1;32m 25\u001B[0m \u001B[38;5;28;01mwhile\u001B[39;00m \u001B[38;5;28;01mTrue\u001B[39;00m:\n\u001B[1;32m 26\u001B[0m ret, frame \u001B[38;5;241m=\u001B[39m cap\u001B[38;5;241m.\u001B[39mread()\n\u001B[0;32m---> 27\u001B[0m cv2\u001B[38;5;241m.\u001B[39mimshow(\u001B[38;5;124m'\u001B[39m\u001B[38;5;124mOur Live Sketcher\u001B[39m\u001B[38;5;124m'\u001B[39m, sketch(frame))\n\u001B[1;32m 28\u001B[0m \u001B[38;5;28;01mif\u001B[39;00m cv2\u001B[38;5;241m.\u001B[39mwaitKey(\u001B[38;5;241m1\u001B[39m) \u001B[38;5;241m==\u001B[39m \u001B[38;5;241m13\u001B[39m \u001B[38;5;129;01mor\u001B[39;00m cv2\u001B[38;5;241m.\u001B[39mwaitKey(\u001B[38;5;241m1\u001B[39m) \u001B[38;5;241m==\u001B[39m \u001B[38;5;241m27\u001B[39m: \u001B[38;5;66;03m#13 is the Enter Key\u001B[39;00m\n\u001B[1;32m 29\u001B[0m \u001B[38;5;28;01mbreak\u001B[39;00m\n",
|
| 90 |
-
"Cell \u001B[0;32mIn[6], line 7\u001B[0m, in \u001B[0;36msketch\u001B[0;34m(image)\u001B[0m\n\u001B[1;32m 5\u001B[0m \u001B[38;5;28;01mdef\u001B[39;00m \u001B[38;5;21msketch\u001B[39m(image):\n\u001B[1;32m 6\u001B[0m \u001B[38;5;66;03m# Convert image to grayscale\u001B[39;00m\n\u001B[0;32m----> 7\u001B[0m img_gray \u001B[38;5;241m=\u001B[39m cv2\u001B[38;5;241m.\u001B[39mcvtColor(image, cv2\u001B[38;5;241m.\u001B[39mCOLOR_BGR2GRAY)\n\u001B[1;32m 9\u001B[0m \u001B[38;5;66;03m# Clean up image using Guassian Blur\u001B[39;00m\n\u001B[1;32m 10\u001B[0m img_gray_blur \u001B[38;5;241m=\u001B[39m cv2\u001B[38;5;241m.\u001B[39mGaussianBlur(img_gray, (\u001B[38;5;241m5\u001B[39m,\u001B[38;5;241m5\u001B[39m), \u001B[38;5;241m0\u001B[39m)\n",
|
| 91 |
-
"\u001B[0;31merror\u001B[0m: OpenCV(4.11.0) /io/opencv/modules/imgproc/src/color.cpp:199: error: (-215:Assertion failed) !_src.empty() in function 'cvtColor'\n"
|
| 92 |
-
]
|
| 93 |
-
}
|
| 94 |
-
],
|
| 95 |
-
"execution_count": 6
|
| 96 |
-
},
|
| 97 |
-
{
|
| 98 |
-
"cell_type": "code",
|
| 99 |
-
"execution_count": null,
|
| 100 |
-
"metadata": {},
|
| 101 |
-
"outputs": [],
|
| 102 |
-
"source": []
|
| 103 |
-
}
|
| 104 |
-
],
|
| 105 |
-
"metadata": {
|
| 106 |
-
"kernelspec": {
|
| 107 |
-
"display_name": "Python 3",
|
| 108 |
-
"language": "python",
|
| 109 |
-
"name": "python3"
|
| 110 |
-
},
|
| 111 |
-
"language_info": {
|
| 112 |
-
"codemirror_mode": {
|
| 113 |
-
"name": "ipython",
|
| 114 |
-
"version": 3
|
| 115 |
-
},
|
| 116 |
-
"file_extension": ".py",
|
| 117 |
-
"mimetype": "text/x-python",
|
| 118 |
-
"name": "python",
|
| 119 |
-
"nbconvert_exporter": "python",
|
| 120 |
-
"pygments_lexer": "ipython3",
|
| 121 |
-
"version": "3.7.4"
|
| 122 |
-
}
|
| 123 |
-
},
|
| 124 |
-
"nbformat": 4,
|
| 125 |
-
"nbformat_minor": 2
|
| 126 |
-
}
|
|
|
|
| 1 |
+
version https://git-lfs.github.com/spec/v1
|
| 2 |
+
oid sha256:29cdd955cdffe553eb049ca1bd367135500c47b2bf6ac39c61444d935726547c
|
| 3 |
+
size 5340
|
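The cvtColor assertion in this notebook's output fires because `cap.read()` returns an empty frame when the camera index does not exist. A hedged sketch of the same loop with the grab guarded (camera index 0 is an assumption; `sketch` is the function defined in the notebook):

    import cv2

    cap = cv2.VideoCapture(0)                 # assumed index of a working webcam
    if not cap.isOpened():
        raise RuntimeError("Could not open the camera")

    while True:
        ret, frame = cap.read()
        if not ret or frame is None:          # bail out instead of crashing in cvtColor
            break
        cv2.imshow('Our Live Sketcher', sketch(frame))
        if cv2.waitKey(1) in (13, 27):        # Enter or Esc
            break

    cap.release()
    cv2.destroyAllWindows()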
4. Get Started! Handwritting Recognition, Simple Object Classification & OpenCV Demo/Test - Imports Keras, OpenCV and tests webcam.ipynb
CHANGED
|
@@ -1,122 +1,3 @@
|
|
| 1 |
-
|
| 2 |
-
|
| 3 |
-
|
| 4 |
-
"cell_type": "code",
|
| 5 |
-
"metadata": {
|
| 6 |
-
"ExecuteTime": {
|
| 7 |
-
"end_time": "2025-03-13T12:26:00.769321Z",
|
| 8 |
-
"start_time": "2025-03-13T12:25:51.814601Z"
|
| 9 |
-
}
|
| 10 |
-
},
|
| 11 |
-
"source": [
|
| 12 |
-
"import cv2\n",
|
| 13 |
-
"import numpy as np\n",
|
| 14 |
-
"\n",
|
| 15 |
-
"# Our sketch generating function\n",
|
| 16 |
-
"def sketch(image):\n",
|
| 17 |
-
" # Convert image to grayscale\n",
|
| 18 |
-
" img_gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)\n",
|
| 19 |
-
" \n",
|
| 20 |
-
" # Clean up image using Gaussian Blur\n",
|
| 21 |
-
" img_gray_blur = cv2.GaussianBlur(img_gray, (5,5), 0)\n",
|
| 22 |
-
" \n",
|
| 23 |
-
" # Extract edges\n",
|
| 24 |
-
" canny_edges = cv2.Canny(img_gray_blur, 10, 70)\n",
|
| 25 |
-
" \n",
|
| 26 |
-
" # Do an invert binarize the image \n",
|
| 27 |
-
" ret, mask = cv2.threshold(canny_edges, 70, 255, cv2.THRESH_BINARY_INV)\n",
|
| 28 |
-
" return mask\n",
|
| 29 |
-
"\n",
|
| 30 |
-
"\n",
|
| 31 |
-
"# Initialize webcam, cap is the object provided by VideoCapture\n",
|
| 32 |
-
"# It contains a boolean indicating if it was sucessful (ret)\n",
|
| 33 |
-
"# It also contains the images collected from the webcam (frame)\n",
|
| 34 |
-
"cap = cv2.VideoCapture(3)\n",
|
| 35 |
-
"\n",
|
| 36 |
-
"while True:\n",
|
| 37 |
-
" ret, frame = cap.read()\n",
|
| 38 |
-
" cv2.imshow('Our Live Sketcher', sketch(frame))\n",
|
| 39 |
-
" if cv2.waitKey(1) == 13: #13 is the Enter Key\n",
|
| 40 |
-
" break\n",
|
| 41 |
-
" \n",
|
| 42 |
-
"# Release camera and close windows\n",
|
| 43 |
-
"cap.release()\n",
|
| 44 |
-
"cv2.destroyAllWindows() "
|
| 45 |
-
],
|
| 46 |
-
"outputs": [],
|
| 47 |
-
"execution_count": 1
|
| 48 |
-
},
|
| 49 |
-
{
|
| 50 |
-
"cell_type": "code",
|
| 51 |
-
"metadata": {
|
| 52 |
-
"ExecuteTime": {
|
| 53 |
-
"end_time": "2025-02-24T14:10:10.316586Z",
|
| 54 |
-
"start_time": "2025-02-24T14:10:10.209728Z"
|
| 55 |
-
}
|
| 56 |
-
},
|
| 57 |
-
"source": "cap = cv2.VideoCapture(0)",
|
| 58 |
-
"outputs": [
|
| 59 |
-
{
|
| 60 |
-
"name": "stderr",
|
| 61 |
-
"output_type": "stream",
|
| 62 |
-
"text": [
|
| 63 |
-
"[ WARN:0@1028.418] global cap_v4l.cpp:913 open VIDEOIO(V4L2:/dev/video0): can't open camera by index\n",
|
| 64 |
-
"[ERROR:0@1028.522] global obsensor_uvc_stream_channel.cpp:158 getStreamChannelGroup Camera index out of range\n"
|
| 65 |
-
]
|
| 66 |
-
}
|
| 67 |
-
],
|
| 68 |
-
"execution_count": 24
|
| 69 |
-
},
|
| 70 |
-
{
|
| 71 |
-
"metadata": {
|
| 72 |
-
"ExecuteTime": {
|
| 73 |
-
"end_time": "2025-02-24T14:10:11.179764Z",
|
| 74 |
-
"start_time": "2025-02-24T14:10:11.162834Z"
|
| 75 |
-
}
|
| 76 |
-
},
|
| 77 |
-
"cell_type": "code",
|
| 78 |
-
"source": "cap.read()\n",
|
| 79 |
-
"outputs": [
|
| 80 |
-
{
|
| 81 |
-
"data": {
|
| 82 |
-
"text/plain": [
|
| 83 |
-
"(False, None)"
|
| 84 |
-
]
|
| 85 |
-
},
|
| 86 |
-
"execution_count": 25,
|
| 87 |
-
"metadata": {},
|
| 88 |
-
"output_type": "execute_result"
|
| 89 |
-
}
|
| 90 |
-
],
|
| 91 |
-
"execution_count": 25
|
| 92 |
-
},
|
| 93 |
-
{
|
| 94 |
-
"metadata": {},
|
| 95 |
-
"cell_type": "code",
|
| 96 |
-
"outputs": [],
|
| 97 |
-
"execution_count": null,
|
| 98 |
-
"source": "cv2.imshow('Our Live Sketcher', sketch(frame))"
|
| 99 |
-
}
|
| 100 |
-
],
|
| 101 |
-
"metadata": {
|
| 102 |
-
"kernelspec": {
|
| 103 |
-
"display_name": "Python 3",
|
| 104 |
-
"language": "python",
|
| 105 |
-
"name": "python3"
|
| 106 |
-
},
|
| 107 |
-
"language_info": {
|
| 108 |
-
"codemirror_mode": {
|
| 109 |
-
"name": "ipython",
|
| 110 |
-
"version": 3
|
| 111 |
-
},
|
| 112 |
-
"file_extension": ".py",
|
| 113 |
-
"mimetype": "text/x-python",
|
| 114 |
-
"name": "python",
|
| 115 |
-
"nbconvert_exporter": "python",
|
| 116 |
-
"pygments_lexer": "ipython3",
|
| 117 |
-
"version": "3.7.4"
|
| 118 |
-
}
|
| 119 |
-
},
|
| 120 |
-
"nbformat": 4,
|
| 121 |
-
"nbformat_minor": 2
|
| 122 |
-
}
|
|
|
|
| 1 |
+
version https://git-lfs.github.com/spec/v1
|
| 2 |
+
oid sha256:198a2df703319d2e43fe0d959797cb7992dad0000bb25d6bef2655c01b417cce
|
| 3 |
+
size 3102
|
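This notebook's output shows `cap.read()` returning `(False, None)` because neither index 3 nor index 0 had a device. A small hypothetical helper for probing indices instead of hard-coding one:

    import cv2

    def find_camera(max_index=5):
        """Return the first camera index that opens and yields a frame, else None."""
        for idx in range(max_index):
            cap = cv2.VideoCapture(idx)
            ok = cap.isOpened() and cap.read()[0]
            cap.release()
            if ok:
                return idx
        return None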
8. Making a CNN in Keras/8.11 - Building a CNN for Image Classification - CIFAR10.ipynb
CHANGED
|
@@ -1,379 +1,3 @@
|
|
| 1 |
-
|
| 2 |
-
|
| 3 |
-
|
| 4 |
-
"cell_type": "markdown",
|
| 5 |
-
"metadata": {},
|
| 6 |
-
"source": [
|
| 7 |
-
"### The CIFAR-10 Dataset\n",
|
| 8 |
-
"* Contains 10 categories of images\n",
|
| 9 |
-
" * airplane\n",
|
| 10 |
-
" * automobile\n",
|
| 11 |
-
" * bird\n",
|
| 12 |
-
" * cat\n",
|
| 13 |
-
" * deer\n",
|
| 14 |
-
" * dog\n",
|
| 15 |
-
" * frog\n",
|
| 16 |
-
" * horse\n",
|
| 17 |
-
" * ship\n",
|
| 18 |
-
" * truck"
|
| 19 |
-
]
|
| 20 |
-
},
|
| 21 |
-
{
|
| 22 |
-
"cell_type": "markdown",
|
| 23 |
-
"metadata": {},
|
| 24 |
-
"source": [
|
| 25 |
-
"### Let's Begin training out model for CIFAR-10 using a deeper CNN"
|
| 26 |
-
]
|
| 27 |
-
},
|
| 28 |
-
{
|
| 29 |
-
"cell_type": "code",
|
| 30 |
-
"execution_count": 10,
|
| 31 |
-
"metadata": {},
|
| 32 |
-
"outputs": [
|
| 33 |
-
{
|
| 34 |
-
"name": "stdout",
|
| 35 |
-
"output_type": "stream",
|
| 36 |
-
"text": [
|
| 37 |
-
"x_train shape: (50000, 32, 32, 3)\n",
|
| 38 |
-
"50000 train samples\n",
|
| 39 |
-
"10000 test samples\n",
|
| 40 |
-
"Model: \"sequential_5\"\n",
|
| 41 |
-
"_________________________________________________________________\n",
|
| 42 |
-
"Layer (type) Output Shape Param # \n",
|
| 43 |
-
"=================================================================\n",
|
| 44 |
-
"conv2d_20 (Conv2D) (None, 32, 32, 32) 896 \n",
|
| 45 |
-
"_________________________________________________________________\n",
|
| 46 |
-
"activation_30 (Activation) (None, 32, 32, 32) 0 \n",
|
| 47 |
-
"_________________________________________________________________\n",
|
| 48 |
-
"conv2d_21 (Conv2D) (None, 30, 30, 32) 9248 \n",
|
| 49 |
-
"_________________________________________________________________\n",
|
| 50 |
-
"activation_31 (Activation) (None, 30, 30, 32) 0 \n",
|
| 51 |
-
"_________________________________________________________________\n",
|
| 52 |
-
"max_pooling2d_10 (MaxPooling (None, 15, 15, 32) 0 \n",
|
| 53 |
-
"_________________________________________________________________\n",
|
| 54 |
-
"dropout_15 (Dropout) (None, 15, 15, 32) 0 \n",
|
| 55 |
-
"_________________________________________________________________\n",
|
| 56 |
-
"conv2d_22 (Conv2D) (None, 15, 15, 64) 18496 \n",
|
| 57 |
-
"_________________________________________________________________\n",
|
| 58 |
-
"activation_32 (Activation) (None, 15, 15, 64) 0 \n",
|
| 59 |
-
"_________________________________________________________________\n",
|
| 60 |
-
"conv2d_23 (Conv2D) (None, 13, 13, 64) 36928 \n",
|
| 61 |
-
"_________________________________________________________________\n",
|
| 62 |
-
"activation_33 (Activation) (None, 13, 13, 64) 0 \n",
|
| 63 |
-
"_________________________________________________________________\n",
|
| 64 |
-
"max_pooling2d_11 (MaxPooling (None, 6, 6, 64) 0 \n",
|
| 65 |
-
"_________________________________________________________________\n",
|
| 66 |
-
"dropout_16 (Dropout) (None, 6, 6, 64) 0 \n",
|
| 67 |
-
"_________________________________________________________________\n",
|
| 68 |
-
"flatten_5 (Flatten) (None, 2304) 0 \n",
|
| 69 |
-
"_________________________________________________________________\n",
|
| 70 |
-
"dense_10 (Dense) (None, 512) 1180160 \n",
|
| 71 |
-
"_________________________________________________________________\n",
|
| 72 |
-
"activation_34 (Activation) (None, 512) 0 \n",
|
| 73 |
-
"_________________________________________________________________\n",
|
| 74 |
-
"dropout_17 (Dropout) (None, 512) 0 \n",
|
| 75 |
-
"_________________________________________________________________\n",
|
| 76 |
-
"dense_11 (Dense) (None, 10) 5130 \n",
|
| 77 |
-
"_________________________________________________________________\n",
|
| 78 |
-
"activation_35 (Activation) (None, 10) 0 \n",
|
| 79 |
-
"=================================================================\n",
|
| 80 |
-
"Total params: 1,250,858\n",
|
| 81 |
-
"Trainable params: 1,250,858\n",
|
| 82 |
-
"Non-trainable params: 0\n",
|
| 83 |
-
"_________________________________________________________________\n",
|
| 84 |
-
"None\n"
|
| 85 |
-
]
|
| 86 |
-
}
|
| 87 |
-
],
|
| 88 |
-
"source": [
|
| 89 |
-
"from __future__ import print_function\n",
|
| 90 |
-
"import tensorflow as tf\n",
|
| 91 |
-
"from tensorflow.keras.datasets import cifar10\n",
|
| 92 |
-
"from tensorflow.keras.preprocessing.image import ImageDataGenerator\n",
|
| 93 |
-
"from tensorflow.keras.models import Sequential\n",
|
| 94 |
-
"from tensorflow.keras.layers import Dense, Dropout, Activation, Flatten\n",
|
| 95 |
-
"from tensorflow.keras.layers import Conv2D, MaxPooling2D\n",
|
| 96 |
-
"from tensorflow.keras.models import load_model\n",
|
| 97 |
-
"from tensorflow.keras.utils import to_categorical\n",
|
| 98 |
-
"import os\n",
|
| 99 |
-
"\n",
|
| 100 |
-
"batch_size = 32\n",
|
| 101 |
-
"num_classes = 10\n",
|
| 102 |
-
"epochs = 1\n",
|
| 103 |
-
"\n",
|
| 104 |
-
"# Loads the CIFAR dataset\n",
|
| 105 |
-
"(x_train, y_train), (x_test, y_test) = cifar10.load_data()\n",
|
| 106 |
-
"\n",
|
| 107 |
-
"# Display our data shape/dimensions\n",
|
| 108 |
-
"print('x_train shape:', x_train.shape)\n",
|
| 109 |
-
"print(x_train.shape[0], 'train samples')\n",
|
| 110 |
-
"print(x_test.shape[0], 'test samples')\n",
|
| 111 |
-
"\n",
|
| 112 |
-
"# Format our training data by Normalizing and changing data type\n",
|
| 113 |
-
"x_train = x_train.astype('float32')\n",
|
| 114 |
-
"x_test = x_test.astype('float32')\n",
|
| 115 |
-
"x_train /= 255\n",
|
| 116 |
-
"x_test /= 255\n",
|
| 117 |
-
"\n",
|
| 118 |
-
"# Now we one hot encode outputs\n",
|
| 119 |
-
"y_train = to_categorical(y_train)\n",
|
| 120 |
-
"y_test = to_categorical(y_test)\n",
|
| 121 |
-
"\n",
|
| 122 |
-
"model = Sequential()\n",
|
| 123 |
-
"# Padding = 'same' results in padding the input such that\n",
|
| 124 |
-
"# the output has the same length as the original input\n",
|
| 125 |
-
"model.add(Conv2D(32, (3, 3), padding='same',\n",
|
| 126 |
-
" input_shape=x_train.shape[1:]))\n",
|
| 127 |
-
"model.add(Activation('relu'))\n",
|
| 128 |
-
"model.add(Conv2D(32, (3, 3)))\n",
|
| 129 |
-
"model.add(Activation('relu'))\n",
|
| 130 |
-
"model.add(MaxPooling2D(pool_size=(2, 2)))\n",
|
| 131 |
-
"model.add(Dropout(0.25))\n",
|
| 132 |
-
"\n",
|
| 133 |
-
"model.add(Conv2D(64, (3, 3), padding='same'))\n",
|
| 134 |
-
"model.add(Activation('relu'))\n",
|
| 135 |
-
"model.add(Conv2D(64, (3, 3)))\n",
|
| 136 |
-
"model.add(Activation('relu'))\n",
|
| 137 |
-
"model.add(MaxPooling2D(pool_size=(2, 2)))\n",
|
| 138 |
-
"model.add(Dropout(0.25))\n",
|
| 139 |
-
"\n",
|
| 140 |
-
"model.add(Flatten())\n",
|
| 141 |
-
"model.add(Dense(512))\n",
|
| 142 |
-
"model.add(Activation('relu'))\n",
|
| 143 |
-
"model.add(Dropout(0.5))\n",
|
| 144 |
-
"model.add(Dense(num_classes))\n",
|
| 145 |
-
"model.add(Activation('softmax'))\n",
|
| 146 |
-
"\n",
|
| 147 |
-
"# initiate RMSprop optimizer and configure some parameters\n",
|
| 148 |
-
"opt = tf.keras.optimizers.RMSprop(lr=0.0001, decay=1e-6)\n",
|
| 149 |
-
"\n",
|
| 150 |
-
"# Let's create our model\n",
|
| 151 |
-
"model.compile(loss = 'categorical_crossentropy',\n",
|
| 152 |
-
" optimizer = opt,\n",
|
| 153 |
-
" metrics = ['accuracy'])\n",
|
| 154 |
-
"\n",
|
| 155 |
-
"print(model.summary())"
|
| 156 |
-
]
|
| 157 |
-
},
|
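`RMSprop(lr=0.0001, decay=1e-6)` uses argument names from older Keras; `lr` later became `learning_rate` and the per-step `decay` was replaced by learning-rate schedules. A hedged equivalent under TF 2.x naming (the old `decay` shrinks the rate by 1/(1 + decay * step), which `InverseTimeDecay` reproduces):

    from tensorflow.keras.optimizers import RMSprop
    from tensorflow.keras.optimizers.schedules import InverseTimeDecay

    # equivalent of the legacy RMSprop(lr=0.0001, decay=1e-6)
    lr_schedule = InverseTimeDecay(initial_learning_rate=1e-4,
                                   decay_steps=1, decay_rate=1e-6)
    opt = RMSprop(learning_rate=lr_schedule)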
| 158 |
-
{
|
| 159 |
-
"cell_type": "markdown",
|
| 160 |
-
"metadata": {},
|
| 161 |
-
"source": [
|
| 162 |
-
"### Training Our Model"
|
| 163 |
-
]
|
| 164 |
-
},
|
| 165 |
-
{
|
| 166 |
-
"cell_type": "code",
|
| 167 |
-
"execution_count": 12,
|
| 168 |
-
"metadata": {},
|
| 169 |
-
"outputs": [
|
| 170 |
-
{
|
| 171 |
-
"name": "stdout",
|
| 172 |
-
"output_type": "stream",
|
| 173 |
-
"text": [
|
| 174 |
-
"Train on 50000 samples, validate on 10000 samples\n",
|
| 175 |
-
"50000/50000 [==============================] - 174s 3ms/sample - loss: 1.4921 - accuracy: 0.4595 - val_loss: 1.5935 - val_accuracy: 0.4478\n",
|
| 176 |
-
"10000/10000 [==============================] - 5s 490us/sample - loss: 1.5935 - accuracy: 0.4478\n",
|
| 177 |
-
"Test loss: 1.5935393712997437\n",
|
| 178 |
-
"Test accuracy: 0.4478\n"
|
| 179 |
-
]
|
| 180 |
-
}
|
| 181 |
-
],
|
| 182 |
-
"source": [
|
| 183 |
-
"history = model.fit(x_train, y_train,\n",
|
| 184 |
-
" batch_size=batch_size,\n",
|
| 185 |
-
" epochs=epochs,\n",
|
| 186 |
-
" validation_data=(x_test, y_test),\n",
|
| 187 |
-
" shuffle=True)\n",
|
| 188 |
-
"\n",
|
| 189 |
-
"model.save(\"cifar_simple_cnn_2.h5\")\n",
|
| 190 |
-
"\n",
|
| 191 |
-
"# Evaluate the performance of our trained model\n",
|
| 192 |
-
"scores = model.evaluate(x_test, y_test, verbose=1)\n",
|
| 193 |
-
"print('Test loss:', scores[0])\n",
|
| 194 |
-
"print('Test accuracy:', scores[1])"
|
| 195 |
-
]
|
| 196 |
-
},
|
| 197 |
-
{
|
| 198 |
-
"cell_type": "markdown",
|
| 199 |
-
"metadata": {},
|
| 200 |
-
"source": [
|
| 201 |
-
"### Plotting our Accuracy and Loss Charts"
|
| 202 |
-
]
|
| 203 |
-
},
|
| 204 |
-
{
|
| 205 |
-
"cell_type": "code",
|
| 206 |
-
"execution_count": 15,
|
| 207 |
-
"metadata": {},
|
| 208 |
-
"outputs": [
|
| 209 |
-
{
|
| 210 |
-
"data": {
|
| 211 |
-
"image/png": "iVBORw0KGgoAAAANSUhEUgAAAYgAAAEGCAYAAAB/+QKOAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADh0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uMy4xLjEsIGh0dHA6Ly9tYXRwbG90bGliLm9yZy8QZhcZAAAevklEQVR4nO3de3RV5bnv8e9jDKKEm1BBpTVYqJVgCCEEUZQgFhUrrcopUG842mLRVi3H3aLDasu2Z3tat4KipWhB3SgMq/WyQRBLyQZaL4ACcjkKUlQu5WYFghEJPOePtYgR3iQrl7lmQn6fMeZwZb7z8ryLmN9655xrTnN3REREDndM3AWIiEjDpIAQEZEgBYSIiAQpIEREJEgBISIiQcfGXUB9at++vWdnZ8ddRo3s3buXFi1axF1GWqnPTYP63DgsXbp0h7t/JdR2VAVEdnY2S5YsibuMGikuLqaoqCjuMtJKfW4a1OfGwcw+qKxNh5hERCRIASEiIkEKCBERCTqqzkGINFTPr/2cdBya3r9/Pxs3buSzzz6LfmfVaN26NWvWrIm7jLRqyH1u3rw5nTp1IjMzM+V1FBAiafDi+/uZkIb9bNy4kZYtW5KdnY2ZpWGPlduzZw8tW7aMtYZ0a6h9dnd27tzJxo0b6dy5c8rr6RCTyFHks88+o127drGHgzQsZka7du1qPLJUQIgcZRQOElKb3wsFhIjwwKvvxV2CNEAKCJF69sCr75E9dtaXJuCIeQ3pj/KEeWvrZTtFRUW88sorX5o3fvx4brzxxirXy8rKAmDz5s0MHTq00m1X90XY8ePH8+mnn5b/PHjwYD755JNUSg967bXX6Ny5M3l5eeTl5ZGVlcUZZ5xBXl4e1157bY22dfDgQe69995K2zt16lSnWiPh7kfN1KtXL29s5s+fH3cJadcU+3zaL2amZT+rV6+u1Xr1Vd+kSZN85MiR7u6+e/dud3fv06ePL1iwoMr1WrRoUe22+/fv74sXL65ymdNOO823b9+eYrXVu+uuu/zZZ59NuYZDfQ7Zv3+/t27dutL2U0891f/1r3/VrtAUhX4/gCVeyd9UjSBEpN4MHTqUmTNnsm/fPgA2bNjA5s2b6devHyUlJQwcOJD8/HzOOussXnzxxSPW37BhA927dwegtLSU4cOHk5uby7BhwygtLS1fbvTo0RQUFJCTk8Pdd98NwIMPPsjmzZsZMGAAAwYMABK339mxYwcA999/P927d6d79+6MHz++fH9nnnkmP/rRj8jJyWHQoEFf2s+8efO48MILK+1vWVkZY8aMobCwkNzcXJ544gkANm3aRL9+/cjLy6N79+78/e9/Z+zYsezZs6dGo48dO3YwZMgQcnNzOeecc1i5ciUAf/3rX+nRowd5eXnk5+ezd+/e4D7rSpe5ihylDh3aqu/lN9x7aaVt7dq1o7CwkDlz5nDBBRcwY8YMhg0bhpnRvHlznn/+eVq1asWOHTs4++yzGTJkSKUnT3//+99zwgknsGLFClasWEF+fn55229+8xtOPPFEDhw4wMCBA1mxYgU333wz999/P/Pnz6d9+/Zf2tbSpUuZOnUqb7zxBu5Onz596N+/P23btmXt2rVMnz6dRx99lO9973s899xzXH311ezYsYPMzExat25daX8nT57MSSedxJtvvsm+ffsoLCxkyJAhTJ8+ncsuu4xf/OIXHDhwgNLSUgoLC3nsscdYtmxZSu8zwC9/+Uv69OnDSy+9xNy5cxk5ciRLlizhd7/7HZMnT6ZPnz6UlJTQvHlzpk2bdsQ+60ojCBGpVyNGjGDGjBkAzJgxgxEjRgCJw9l33HEHubm5XHjhhWzatImtW7dWup0FCxZw9dVXA5Cbm0tubm552zPPPEN+fj49e/Zk1apVrF69usqaFi1axOWXX06LFi3IysriiiuuYOHChQDl5xgAevXqxYYNGwCYO3cugwYNqnK7c+fOZerUqeTl5dGnTx927drF2rVr6d27N4899hi//vWvWblyZfk5lppatGgR11xzDQCDBg1i8+bN7N27l3PPPZdbb72Vhx56iN27d5ORkVFv+6xIIwiRNPjO11P/9mp9qeqT/uGyx86q0fJV+e53v8uYMWNYtmwZpaWl5Z/8n3rqKbZv387SpUvJzMwkOzu72uvyQ6OLf/zjH9x3330sXryYtm3bMnLkyGq3kzjUHnbccceVv87IyCj/5D179mzGjBlT7XYfeeQRBg4cCHz5i3LFxcXMmjWLq666ittvv51hw4ZVua1U6j7085133smQIUOYNWsWvXv3pri4mAsuuOCIfV511VU13mdFGkGIpMHlXZvFXULaZGVlUVRUxE033VQ+egDYtWsXJ510EpmZmcyfP58PPqj0LtMAnH/++Tz11FMArFy5khUrVgCwe/duWrRoQevWrdm6dSuzZ88uX6dly5bs2bMnuK0XXniBTz/9lL179/L8889z3nnnVbpvd2fFihXlI4vKXHTRRTzyyCOUlZUBsHbtWkpLS/nggw/o2LEjo0aNYuTIkbz99tsce2zi8/ihZVNR8T34y1/+QqdOnWjRogXvv/8+ubm53H777fTs2ZN33303uM+60ghCROrdiBEjuOKKK3jmmWfK51111VVcdtllFBQUkJeXxze/+c0qtzF69Giuv/56cnNzycvLo7CwEIAePXrQs2dPcnJyOP300zn33HPL1xk1ahSXXHIJJ598MvPnzy+fn5+fz8iRI8u38cMf/pCePXuWH0463NKlS+nZs2e1Xy674YYb+PDDD8uDpF27dsycOZN58+Zx//33k5mZSVZWFtOmTQPgBz/4Abm5uRQUFPDkk08esb2cnJzyfX7/+99n3Lhx5e9BVlYWU6dOBeC+++5j4cKFHHPMMeTm5jJo0CCmTZsW3GddWFVDr8amoKDA9cCghk99js6aNWs488wza7xefR5iOqSh3pcoFffccw9dunRh+PDhNVqvofc59PthZkvdvSC0vEYQIsItA7vGXUKDcuedd8ZdQoOgcxAiws++9Y24S5AGSAEhIiJBCggREQlSQIiISJACQkREghQQIk3ZogdgT+W3u6ipnTt3lt8au0uXLpx66qnlP3/++ecpbeP666/n3XffrXKZhx9+uPwLZHXVr1+/Gt0fqSnRZa4iTVnJNvjbBLj4/9TL5tq1a1f+x/b222+nXbt23HbbbV9apvxW0seEP58e+jJYVW666aa6FyvV0ghCpCk79xZY/nS9jiJC1q1bR/fu3fnxj39Mfn4+W7ZsYdSoUeW37B43blz5soc+0ZeVldGmTRvGjh1Ljx496Nu3L9u2bQMS31M4dMvufv36MXbsWAoLCznjjDPKb3O9d+9errzySnr06MGIESMoKChIeaRQWlrKddddx1lnnUV+fj4LFiwA4J133qF3797k5eWRm5vL+vXr2bNnD5dccgk9evSgT58+PPvss/X51sVKIwiRo9WvKr9N9RH+swbfg/jVrprXAqxevZqpU6cyadIkAO69915OPPFEy
srKGDBgAEOHDqVbt25fWmfXrl3079+fe++9lzFjxjBlyhTGjh17xLbdnTfffJOXXnqJcePGMWfOHB566CE6duzIc889x/Lly790u/DqPPjggzRr1ox33nmHVatWMXjwYNauXcsjjzzCbbfdxrBhw9i3bx/uzosvvkh2djazZ89mz549HDx4sFbvT0OkEYSIpMXXv/51evfuXf7z9OnTyc/PJz8/nzVr1gRv2X388cdzySWXAF++FffhrrjiiiOWWbRoUfmtMnr06EFOTk7KtVa8zXZOTg6nnHIK69at45xzzuGee+7ht7/9LR999BHNmzcnNzeXOXPmMHbsWF5//fUqnx/R2GgEIXK0SuWT/ud7YfIA6Hcr5H0/0nJatGhR/nrt2rVMmDCBN998kzZt2nD11VcHb9ndrNkXd8HNyMio9E6oh27ZXXGZutxnrrJ1r7nmGvr27cusWbP41re+xRNPPMH555/PkiVLePnll7nzzjtZvHgxd9xxR6333ZBoBCHSlM26DToVRB4Oh9u9ezctW7akVatWbNmyhVdeeaXe99GvX7/yu8m+88471T5UqKKKt9les2YNW7ZsoUuXLqxfv54uXbpwyy23cOmll7JixQo2bdpEVlYW11xzDT/5yU9466236r0vcdEIQqSpevsp2PwW/Oivad91fn4+3bp1o3v37kfcsru+/PSnP+Xaa68lNzeX/Px8unfvXunhn4suuojMzMRDnc477zymTJnCDTfcwFlnnUVmZiZPPvkkzZo14+mnn2b69OlkZmZyyimncM8995Q/b/qYY44hIyODRx99tN77EptDl5zV9wRMAbYBKytpLwJ2AcuS010V2n4GrAJWAtOB5qnss1evXt7YzJ8/P+4S0k59js7q1atTX3jhA+5ba7B8De3evTuybadi//79Xlpa6u7u7733nmdnZ/v+/fsj3Wfcfa5O6PcDWOKV/E2NcgTxODAROPKpGF9Y6O7frjjDzE4Fbga6uXupmT0DDE9uT0TqS79b464gUiUlJQwcOJCysjLcnT/84Q/lT3WT1ET2brn7AjPLruXqxwLHm9l+4ARgc33VJSJNQ5s2bVi6dGncZTRqcZ+k7mtmy81stpnlALj7JuA+4ENgC7DL3efGWaRIY+JH0VMipf7U5vci0keOJkcQM929e6CtFXDQ3UvMbDAwwd27mllb4DlgGPAJ8CfgWXcPPmDVzEYBowA6dOjQa8aMGZH0JSolJSVkZWXFXUZaqc/RycrKokOHDrRu3bra5ylH7cCBA2RkZMRaQ7o11D67O7t27WLr1q2UlJR8qW3AgAGVPnI0toAILLsBKAAGABe7+w+S868Fznb3G6vbhp5J3Tioz9HZv38/GzduDH6nIN0+++wzmjdvHncZadWQ+9y8eXM6depUfrXWIQ3ymdRm1hHY6u5uZoUkDnftJHFo6WwzOwEoBQYCjeuvvkhMMjMz6dy5c9xlAIlQ7NmzZ9xlpNXR1ufIAsLMppO4lLW9mW0E7gYyAdx9EjAUGG1mZSSCYHjykqs3zOxZ4C2gDHgbmBxVnSIiEhblVUwjqmmfSOIy2FDb3SQCRUREYhL3VUwiItJAKSBERCRIASEiIkEKCBERCVJAiIhIkAJCRESCFBAiIhKkgBARkSAFhIiIBCkgREQkSAEhIiJBCggREQlSQIiISJACQkREghQQIiISpIAQEZEgBYSIiAQpIEREJEgBISIiQQoIEREJUkCIiEiQAkJERIIUECIiEqSAEBGRIAWEiIgEKSBERCRIASEiIkEKCBERCVJAiIhIkAJCRESCIgsIM5tiZtvMbGUl7UVmtsvMliWnuyq0tTGzZ83s/5nZGjPrG1WdIiISdmyE234cmAg8WcUyC93924H5E4A57j7UzJoBJ0RQn4iIVCGyEYS7LwA+rul6ZtYKOB/4Y3I7n7v7J/VcnoiIVCPucxB9zWy5mc02s5zkvNOB7cBUM3vbzB4zsxYx1igi0iSZu0e3cbNsYKa7dw+0tQIOunuJmQ0GJrh7VzMrAF4HznX3N8xsArDb3X9ZyT5GAaMAOnTo0GvGjBkR9SYaJSUlZGVlxV1GWqnPTYP63DgMGDBgqbsXhNpiC4jAshuAAhLnRV539+zk/POAse5+aXXbKCgo8CVLltSh4vQrLi6mqKgo7jLSSn1uGtTnxsHMKg2I2A4xmVlHM7Pk68JkLTvd/Z/AR2Z2RnLRgcDqmMoUEWmyIruKycymA0VAezPbCNwNZAK4+yRgKDDazMqAUmC4fzGc+SnwVPIKpvXA9VHVKSIiYZEFhLuPqKZ9IonLYENty0gcbhIRkZjEfRWTiIg0UAoIEREJUkCIiEiQAkJERIIUECIiEqSAEBGRIAWEiIgEKSBERCRIASEiIkEKCBERCVJAiIhIkAJCRESCFBAiIhKkgBARkSAFhIiIBCkgREQkSAEhIiJBCggREQlSQIiISJACQkREglIKCDP7upkdl3xdZGY3m1mbaEsTEZE4pTqCeA44YGZdgD8CnYGnI6tKRERil2pAHHT3MuByYLy7/ww4ObqyREQkbqkGxH4zGwFcB8xMzsuMpiQREWkIUg2I64G+wG/c/R9m1hmYFl1ZIiISt2NTWcjdVwM3A5hZW6Clu98bZWEiIhKvVK9iKjazVmZ2IrAcmGpm90dbmoiIxCnVQ0yt3X03cAUw1d17ARdGV5aIiMQt1YA41sxOBr7HFyepRUTkKJZqQIwDXgHed/fFZnY6sDa6skREJG4pBYS7/8ndc919dPLn9e5+ZVXrmNkUM9tmZisraS8ys11mtiw53XVYe4aZvW1mGrGIiMQg1ZPUnczs+eQf/K1m9pyZdapmtceBi6tZZqG75yWncYe13QKsSaU+ERGpf6keYpoKvAScApwK/HdyXqXcfQHwcW2KSobPpcBjtVlfRETqLtWA+Iq7T3X3suT0OPCVeth/XzNbbmazzSynwvzxwM+Bg/WwDxERqYWUvigH7DCzq4HpyZ9HADvruO+3gNPcvcTMBgMvAF3N7NvANndfamZF1W3EzEYBowA6dOhAcXFxHctKr5KSkkZXc12pz02D+nwUcPdqJ+BrJA4xbQe2kfhj/rUU1ssGVqa4jw1Ae+A/gI3Jn/8JfApMS2UbvXr18sZm/vz5cZeQdupz06A+Nw7AEq/kb2qqVzF96O5D3P0r7n6Su3+XxJfmas3MOpqZJV8XkjjctdPdb3f3Tu6eDQwH/uruV9dlXyIiUnOpHmIKGUPiXEGQmU0HioD2ZrYRuJvkHWDdfRIwFBhtZmVAKTA8mWYiItIA1CUgrKpGdx9RTftEYGI1yxQDxTUtTERE6q4uz6TWp30RkaNYlSMIM9tDOAgMOD6SikREpEGoMiDcvWW6ChERkYalLoeYRETkKKaAEBGRIAWEiIgEKSBERCRIASEiIkEKCBERCVJAiIhIkAJCRESCFBAiIhKkgBARkSAFhIiIBCkgREQkSAEhIiJBCggREQlSQIiISJACQkREghQQIiISpIAQEZEgBYSIiAQpIEREJEgBISIiQQoIEREJUkCI
iEiQAkJERIIUECIiEqSAEBGRIAWEiIgERRYQZjbFzLaZ2cpK2ovMbJeZLUtOdyXnf9XM5pvZGjNbZWa3RFWjiIhU7tgIt/04MBF4soplFrr7tw+bVwb8b3d/y8xaAkvN7FV3Xx1RnSIiEhDZCMLdFwAf12K9Le7+VvL1HmANcGo9lyciItWI+xxEXzNbbmazzSzn8EYzywZ6Am+kuzARkabO3D26jSf+wM909+6BtlbAQXcvMbPBwAR371qhPQv4H+A37v7nKvYxChgF0KFDh14zZsyo305ErKSkhKysrLjLSCv1uWlQnxuHAQMGLHX3glBbbAERWHYDUODuO8wsE5gJvOLu96e6v4KCAl+yZEktq41HcXExRUVFcZeRVupz06A+Nw5mVmlAxHaIycw6mpklXxcma9mZnPdHYE1NwkFEROpXZFcxmdl0oAhob2YbgbuBTAB3nwQMBUabWRlQCgx3dzezfsA1wDtmtiy5uTvc/eWoahURkSNFFhDuPqKa9okkLoM9fP4iwKKqS0REUhP3VUwiItJAKSBERCRIASEiIkEKCBERCVJAiIhIkAJCRESCFBAiIhKkgBARkSAFhIiIBCkgREQkSAEhIiJBCggREQlSQIiISJACQkREghQQIiISpIAQEZEgBYSIiAQpIEREJEgBISIiQQoIEREJUkCIiEiQAkJERIIUECIiEqSAEBGRIAWEiIgEKSBERCRIASEiIkEKCBERCVJAiIhIkAJCRESCIgsIM5tiZtvMbGUl7UVmtsvMliWnuyq0XWxm75rZOjMbG1WNIiJSuShHEI8DF1ezzEJ3z0tO4wDMLAN4GLgE6AaMMLNuEdYpIiIBkQWEuy8APq7FqoXAOndf7+6fAzOA79RrcSIiUq24z0H0NbPlZjbbzHKS804FPqqwzMbkPBERSaNjY9z3W8Bp7l5iZoOBF4CugAWW9co2YmajgFEAHTp0oLi4OIJSo1NSUtLoaq4r9blpUJ8bv9gCwt13V3j9spk9YmbtSYwYvlph0U7A5iq2MxmYDFBQUOBFRUXRFByR4uJiGlvNdaU+Nw3qc+MX2yEmM+toZpZ8XZisZSewGOhqZp3NrBkwHHgprjpFRJqqyEYQZjYdKALam9lG4G4gE8DdJwFDgdFmVgaUAsPd3YEyM/sJ8AqQAUxx91VR1SkiImGRBYS7j6imfSIwsZK2l4GXo6hLRERSE/dVTCIi0kApIEREJEgBISIiQQoIEREJUkCIiEiQAkJERIIUECJRWvQA7NkadxUitaKAEIlSyTb424S4qxCpFQWESJTOvQWWP02zff+KuxKRGlNAiESpZUfoMYKvfvTnuCsRqbE4b/ct0jR84yK++vp34Fetvzy//1gYcHs8NYmkQAEhEqXP98LLP2fNN2/hzOHj4q5GpEZ0iEkkSrNug04FbO14QdyViNSYAkIkKm8/BZvfgsG/i7sSkVpRQIhEZe92+F+PQ7MWcVciUis6ByESlX63xl2BSJ1oBCEiIkEKCBERCVJAiIhIkAJCRESCzN3jrqHemNl24IO466ih9sCOuItIM/W5aVCfG4fT3P0roYajKiAaIzNb4u4FcdeRTupz06A+N346xCQiIkEKCBERCVJAxG9y3AXEQH1uGtTnRk7nIEREJEgjCBERCVJAiIhIkAIiQmZ2sZm9a2brzGxsoP00M5tnZivMrNjMOlVo+5qZzTWzNWa22syy01l7bdWxz781s1XJPj9oZpbe6mvOzKaY2TYzW1lJuyX7si7Z5/wKbdeZ2drkdF36qq6b2vbZzPLM7LXkv/EKMxuW3sprry7/zsn2Vma2ycwmpqfieuLumiKYgAzgfeB0oBmwHOh22DJ/Aq5Lvr4A+K8KbcXAt5Kvs4AT4u5TlH0GzgH+ltxGBvAaUBR3n1Lo8/lAPrCykvbBwGzAgLOBN5LzTwTWJ//bNvm6bdz9ibjP3wC6Jl+fAmwB2sTdnyj7XKF9AvA0MDHuvtRk0ggiOoXAOndf7+6fAzOA7xy2TDdgXvL1/EPtZtYNONbdXwVw9xJ3/zQ9ZddJrfsMONCcRLAcB2QCWyOvuI7cfQHwcRWLfAd40hNeB9qY2cnARcCr7v6xu/8LeBW4OPqK6662fXb399x9bXIbm4FtQPAbvA1NHf6dMbNeQAdgbvSV1i8FRHROBT6q8PPG5LyKlgNXJl9fDrQ0s3YkPml9YmZ/NrO3zex3ZpYRecV1V+s+u/trJAJjS3J6xd3XRFxvOlT2nqTyXjVW1fbNzApJfBh4P411RSnYZzM7BvhP4N9iqaqOFBDRCR0/P/ya4tuA/mb2NtAf2ASUkXiQ03nJ9t4kDtmMjKzS+lPrPptZF+BMoBOJ/9kuMLPzoyw2TSp7T1J5rxqrKvuW/GT9X8D17n4wbVVFq7I+3wi87O4fBdobPD1RLjobga9W+LkTsLniAslh9hUAZpYFXOnuu8xsI/C2u69Ptr1A4rjmH9NReB3Upc+jgNfdvSTZNptEnxeko/AIVfaebASKDptfnLaqolXp74GZtQJmAXcmD8UcLSrrc1/gPDO7kcS5xGZmVuLuR1zA0RBpBBGdxUBXM+tsZs2A4cBLFRcws/bJISjA7cCUCuu2NbNDx2cvAFanoea6qkufPyQxsjjWzDJJjC6OhkNMLwHXJq9yORvY5e5bgFeAQWbW1szaAoOS844GwT4nfyeeJ3Gs/k/xlljvgn1296vc/Wvunk1i9PxkYwkH0AgiMu5eZmY/IfE/fQYwxd1Xmdk4YIm7v0TiE+R/mJmT+KR8U3LdA2Z2GzAveannUuDROPpRE3XpM/AsiSB8h8TQfI67/3e6+1BTZjadRJ/aJ0d+d5M4wY67TwJeJnGFyzrgU+D6ZNvHZvbvJEIVYJy7V3UStMGobZ+B75G4GqidmY1Mzhvp7svSVnwt1aHPjZputSEiIkE6xCQiIkEKCBERCVJAiIhIkAJCRESCFBAiIhKkgBCphpkdMLNlFaZ6u47dzLIru0OoSNz0PQiR6pW6e17cRYikm0YQIrVkZhvM7P+a2ZvJqUtyfsVnXswzs68l53cws+fNbHlyOie5qQwzezT5nIS5ZnZ8cvmbLfEskBVmNiOmbkoTpoAQqd7xhx1iqvigm93uXghMBMYn500kcUuFXOAp4MHk/AeB/3H3HiSeLbAqOb8r8LC75wCf8MXdbscCPZPb+XFUnROpjL5JLVKN5M3VsgLzNwAXuPv65P2j/unu7cxsB3Cyu+9Pzt/i7u3NbDvQyd33VdhGNonnQnRN/vwLINPd7zGzOUAJ8ALwwqEbGYqki0YQInXjlbyubJmQfRVeH+CLc4OXAg8DvYClZqZzhpJWCgiRuhlW4b+vJV//ncSdbAGuAhYlX88DRgOYWUby1tdByTveftXd5wM/B9qQuF20SNroE4lI9Y43s4p3HJ1T4ZbNx5nZGyQ+bI1IzrsZmGJm/wZs54s7e94CTDazH5AYKYwm8fS8kAxgmpm1JvEwmgfc/ZN665FICnQOQqSWkucgCtx9R9y1iERBh5hERCRIIwgREQnSCEJERII
UECIiEqSAEBGRIAWEiIgEKSBERCTo/wOh2g5veB4j7QAAAABJRU5ErkJggg==\n",
|
| 212 |
-
"text/plain": [
|
| 213 |
-
"<Figure size 432x288 with 1 Axes>"
|
| 214 |
-
]
|
| 215 |
-
},
|
| 216 |
-
"metadata": {
|
| 217 |
-
"needs_background": "light"
|
| 218 |
-
},
|
| 219 |
-
"output_type": "display_data"
|
| 220 |
-
}
|
| 221 |
-
],
|
| 222 |
-
"source": [
|
| 223 |
-
"# Plotting our loss charts\n",
|
| 224 |
-
"import matplotlib.pyplot as plt\n",
|
| 225 |
-
"\n",
|
| 226 |
-
"history_dict = history.history\n",
|
| 227 |
-
"\n",
|
| 228 |
-
"loss_values = history_dict['loss']\n",
|
| 229 |
-
"val_loss_values = history_dict['val_loss']\n",
|
| 230 |
-
"epochs = range(1, len(loss_values) + 1)\n",
|
| 231 |
-
"\n",
|
| 232 |
-
"line1 = plt.plot(epochs, val_loss_values, label='Validation/Test Loss')\n",
|
| 233 |
-
"line2 = plt.plot(epochs, loss_values, label='Training Loss')\n",
|
| 234 |
-
"plt.setp(line1, linewidth=2.0, marker = '+', markersize=10.0)\n",
|
| 235 |
-
"plt.setp(line2, linewidth=2.0, marker = '4', markersize=10.0)\n",
|
| 236 |
-
"plt.xlabel('Epochs') \n",
|
| 237 |
-
"plt.ylabel('Loss')\n",
|
| 238 |
-
"plt.grid(True)\n",
|
| 239 |
-
"plt.legend()\n",
|
| 240 |
-
"plt.show()"
|
| 241 |
-
]
|
| 242 |
-
},
|
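The next cell's source is cut off in this diff; given the heading above, it presumably plots accuracy. A hedged reconstruction in the same style, using the `accuracy`/`val_accuracy` keys that `metrics=['accuracy']` records in `history.history`:

    import matplotlib.pyplot as plt

    # Plotting our accuracy chart in the same style as the loss chart
    acc_values = history.history['accuracy']
    val_acc_values = history.history['val_accuracy']
    epochs = range(1, len(acc_values) + 1)

    line1 = plt.plot(epochs, val_acc_values, label='Validation/Test Accuracy')
    line2 = plt.plot(epochs, acc_values, label='Training Accuracy')
    plt.setp(line1, linewidth=2.0, marker='+', markersize=10.0)
    plt.setp(line2, linewidth=2.0, marker='4', markersize=10.0)
    plt.xlabel('Epochs')
    plt.ylabel('Accuracy')
    plt.grid(True)
    plt.legend()
    plt.show()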
| 243 |
-
{
|
| 244 |
-
"cell_type": "code",
|
| 245 |
-
"execution_count": 17,
|
| 246 |
-
"metadata": {},
|
| 247 |
-
"outputs": [
|
| 248 |
-
{
|
| 249 |
-
"data": {
|
| 250 |
-
"image/png": "iVBORw0KGgoAAAANSUhEUgAAAY4AAAEJCAYAAACDscAcAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADh0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uMy4xLjEsIGh0dHA6Ly9tYXRwbG90bGliLm9yZy8QZhcZAAAgAElEQVR4nO3de3xU5bX/8c8iQFFBQLDQAhrsgXogJgECyAExlIpoj+CFKqi/Fq0iiqVHDypVWyn2dYqeeqlVsbQFtcXghQNy5FqpU/R4A+SiQJWLqAFEAbkEAQms3x+zGSdhkswmmUlCvu/Xa16Z/exnP7NWAlnZt2ebuyMiIpKsetUdgIiI1C4qHCIiEooKh4iIhKLCISIioahwiIhIKCocIiISSkoLh5kNNLP3zWydmY0tp98QM3Mzy4tryzazN8xslZm9a2aNgvZuwfI6M3vEzCyVOYiISEkpKxxmlgE8BlwAdAKGmVmnBP2aAKOBt+La6gN/BUa6e2cgHzgYrJ4IjAA6BK+BqcpBRESOVj+FY/cA1rn7BgAzmwYMBlaX6ncvcD8wJq5tALDS3VcAuPv2YIxvASe7+xvB8tPAxcDc8gJp2bKlZ2ZmVjaftNq7dy8nnXRSdYeRVsq5blDOtcfSpUu3ufuppdtTWTjaAJ/ELRcCPeM7mFkXoJ27v2Rm8YWjI+BmNh84FZjm7vcHYxaWGrNNRYFkZmayZMmSY8uimkQiEfLz86s7jLRSznWDcq49zOyjRO2pLByJzj3E5jcxs3rAQ8DwBP3qA32A7sCXwEIzWwrsLm/MEh9uNoLoIS1atWpFJBIJEXr1KyoqqnUxV5ZyrhuUc+2XysJRCLSLW24LbI5bbgJkAZHg/HZrYJaZDQq2/Ye7bwMwszlAV6LnPdqWM2aMu08CJgHk5eV5bav2tfUvlMpQznWDcq79UnlV1WKgg5m1N7OGwFBg1pGV7r7L3Vu6e6a7ZwJvAoPcfQkwH8g2sxODE+XnAqvdfQuwx8zODq6m+hHwYgpzEBGRUlK2x+HuxWZ2M9EikAFMdvdVZjYeWOLus8rZ9gsze5Bo8XFgjrvPDlbfCDwJnED0pHi5J8ZFaouDBw9SWFjI/v37qzuUlGratClr1qyp7jDSqqbn3KhRI9q2bUuDBg2S6p/KQ1W4+xxgTqm2X5bRN7/U8l+JHpoq3W8J0UNcIrXXaw9BzpUlmgoLC2nSpAmZmZkcz7cn7dmzhyZNmlR3GGlVk3N2d7Zv305hYSHt27dPahvdOS5SHYo+g//7XYmm/fv306JFi+O6aEjNY2a0aNEi1J6uCodIdej9M1jxDA0PfFGiWUVDqkPYf3cqHCLVoUlryBlGu0/+p9JDPfS3D6ogIJHkqXCIVJeO59OucBaMaxp97fwYNi+D3VtCDfO7hWurJJz8/Hzmz59fou3hhx/mpptuKne7xo0bA7B582aGDBlS5tgV3YT78MMP8+WXX8aWL7zwQnbu3JlM6Am98cYbtG/fntzcXHJzc2ncuDHf/e53yc3N5Uc/+lGosQ4fPsyECRPK7bN48WLMjIULFx5zzLWFCodIdfhqL8y5nTVn/gzG7Yq+mp0G3+4CJ3+rWkIaNmwY06ZNK9E2bdo0hg0bltT23/72t3nhhReO+fNLF445c+bQrFmzYx5v3rx5/Pa3v2X58uUsX76cvLw8pk6dyvLly3n66adDjZVM4SgoKKBPnz4UFBQcc8zJKC4uTun4yVDhEKkOs8dA2zy2tv5edUcSM2TIEF566SUOHDgAwMaNG9m8eTN9+vShqKiI/v3707VrV8466yxefPHo26c2btxIVlb0gsd9+/YxdOhQsrOzueKKK9i3b1+s3y233EJeXh6dO3fmnnvuAeCRRx5h8+bN9OvXj379+gHRqYK2bdsGwIMPPkhWVhZZWVk8/PDDsc/713/9V66//no6d+7MgAEDSnzOwoUL+f73v19mvsXFxdx666306NGD7Oxs/vSnPwGwadMm+vTpQ25uLllZWbz++uuMHTuWPXv2lLm3cvjwYaZPn85TTz3F3Llz+eqrr2LrpkyZQq9evcjJyeGaa64B4NNPP2Xw4MFkZ2eTk5PDW2+9xbp168jNzY1tN2HCBH79618D0KdPH+666y769u3Lo48+yosvvkjPnj3p0qULAwYM4LPPPgOiV2/9+Mc/5qyzziI7O5uZM2fyhz/8gdtuuy027sSJE7n99tvL/L4kI6WX44pIAsumwuZ34Pq/w+uLE3bJHDs7YXtZku2/ccIPylzXokULevTowbx58xg8eDDTpk3jiiuuwMxo1KgRM2bM4OSTT2bbtm2cffbZDBo0qMyTqhMnTuTEE09k5cqVrFy5kq5du8bW/eIXv+D000/n0KFD9O/fn5UrVzJ69GgefPBBXnnlFVq2bFlirKVLlzJlyhTeeust3J2ePXty7rnn0rx5c9auXUtBQQF//OMfufzyy5k+fTpXX30127Zto0GDBjRt2rTMfCdNmsQ3v/lN3n77bQ4cOMDZZ5/NgAEDKCgo4KKLLuKOO+7g0KFD7Nu3jx49evCnP/2J5cuXJxxr0aJFnHnmmZxxxhn07t2befPmMWjQIFasWMF9993H/PnzOf3009mxYwcAo0aN4rzzzuPmm2+muLiYL7/8MvbLvyy7d+9m0aJFAHzxxRex7/8TTzzBAw88wH333ce4ceM49dRTeffdd3F3du7cSf369cnNzeU3v/kN9evXZ8qUKTz55JPlflZFVDhE0m3v5/DDJ6FhzZst9cjhqiOFY/LkyUD0Wv8777yTRYsWUa9ePTZt2sTWrVtp3bp1wnEWLVrE6NGjAcjOziY7Ozu2bsaMGTz99NMUFxezZcsWVq9eXWJ9aa+99hqXXHJJbHbZSy+9lFdffZVBgwbFzmEAdOvWjY0bNwKwYMECBgwYUG6uCxYsYM2aNbHDc7t27WLt2rV0796dG264gf3793PxxReTk5NT4eGhgoIChg4dCsDQoUMpKChg0KBB/P3vf+eKK67glFNOAYh9jUQisc+tX78+J598coWF48j4AB9//DGXX345n376KQcOHKBjx44AvPzyy8ycOROIXinVvHlzAPr27cvcuXM544wzyMjIoFOno55wEYoKh0i69fmPCruUt2dQWubY2aH6l+fiiy/m1ltv5Z133mHfvn2xPYWpU6fy+eefs3TpUho0aEBmZmaF1/0n2hv58MMPeeSRR1i6dCnNmzdn+PDhFY7jnnAeUwC+8Y1vxN5nZGTEDlXNnTuXW2+9tcJxH3/8cfr373/UukgkwuzZs7nqqqv4+c9/zhVXXFHmOAcPHmTGjBnMmTOHX/3qVxw+fJidO3eyd+9e3L3MvbLS7fXr1+fw4cOx5f3791O//te/ouOnZR81ahR33nknF154IS+//HLs/EtZn3fdddfx4IMPkpmZGTtcVhk6xyEiMY0bNyY/P59rr722xEnxXbt28c1vfpMGDRrwyiuv8NFHCWfbjunbty9Tp04F4L333mPlypVA9HDLS
SedRNOmTdm6dStz5349Y1CTJk3Ys2dPwrFmzpzJl19+yd69e5kxYwbnnHNOmZ/t7qxcubLE+YJEzj//fB5//PHY3sT777/Pvn37+Oijj2jdujUjRoxg+PDhLFu2LPYLPNGex4IFC+jevTuffPIJGzdu5OOPP+aiiy5i1qxZfP/732fatGmxQ1RHvvbr148nnngCgEOHDrF7925at27N5s2b+eKLL9i/fz+zZ5d9+HHXrl20adMGd+epp56KtQ8YMIBHH3009n344ovofUK9e/dm/fr1PP/88+UWwWSpcIhICcOGDWPFihUlDo1cddVVLFmyJHZl0plnnlnuGDfeeCNFRUVkZ2dz//3306NHDwBycnLIzs6mc+fOXHvttfTu3Tu2zYgRI7jgggtiJ8eP6Nq1K8OHD6dHjx707NmT6667ji5dupT52UuXLqVLly4V3tR2ww030KFDh9hJ8BtvvJHi4mIWLlxITk4OXbp04cUXX+SnP/0pAD/5yU/Izs4+6uR4QUEBl1xySYm2yy67jGeeeYbs7Gxuv/12LrjgAnJzc2MnqR999FHmz5/PWWedRV5eHv/85z9p1KgRd955J927d2fQoEHlHk4aN24cl1xyCeeeey6tWrWKtd9zzz1s3bqVrKwscnNzefXVV2PrhgwZQt++fcs975M0dz/uX926dfPa5pVXXqnuENKurue8evXqYxrj9DteqqJo0mP37t0pHf/ee+/1goKClH5GWKnOORnnn3++RyKRMtcn+vdHdELao36n6hyHSC33s/4dqjuEGuXuu++u7hBqlO3bt9OrVy+6devGueeeWyVjqnCI1HK3nNexukOQGqxFixZ88EHVTkujcxwiIhKKCoeIiISiwiEiIqGocIiISCgqHCK11WsPwZ6tVTLU9u3bY9OPt27dmjZt2sSW4yfsK88111zD+++/X26fxx57jGeffbYqQgZg69at1K9fnz//+c9VNqZUTFdVidRWRx4/O/C/Kj1UixYtYhP4jRs3jsaNGzNmzJgSfY5cw1+vXuK/N6dMmVLh54waNSrh3eHH6tlnn6VXr14UFBTwk5/8pMrGLa24uLjE9B91nfY4RGqr4PGzVbXXkci6devIyspi5MiRdO3alS1btjBixIjYtOjjx4+P9e3Tpw/Lly+nuLiYZs2aMXbsWHJycujVq1dsAr+7776bxx57LNZ/7Nix9OjRg+9+97u8/vrrAOzdu5fLLruMnJwchg0bRl5eXpmz0hYUFPDwww+zYcMGPv3001j77Nmz6dq1Kzk5ObHJDhNNOX4k1iOmTZvGddddB8DVV1/Nf/7nf9KvXz/uvPNO3nzzTXr16kWXLl3o3bs3a9dGH6BVXFzMLbfcQlZWFtnZ2Tz++OPMnz+fH/7wh7FxFyxYwOWXX17pn0dNoRIqUhONCzEtxAMh7uMYtyt0KKtXr2bKlCmxuZUmTJjAKaecQnFxMf369WPIkCFHTY+xa9cuzj33XCZMmMCtt97K5MmTGTt27FFjuztvv/02s2bNYvz48cybN4/f//73tG7dmunTp7NixYoSU7LH27hxI1988QXdunVjyJAhPPfcc4wePZpPP/2UG2+8kVdffbXEVOaJphyvyPr161m4cCH16tVj165dvPbaa2RkZDBv3jzuvvtunn32WSZOnMjmzZtZsWIFGRkZ7Nixg2bNmjF69Gi2b99OixYtmDp1Ktdff33Yb32NpT0OESnXd77zHbp37x5bLigooGvXrnTt2pU1a9awevXqo7Y54YQTuOCCC4CS052Xdumllx7V57XXXovNk5WTk0Pnzp0TbltQUBCbsO/IVOYQfWRsv379OP3004GvpzJ/+eWXGTVqFFByyvHy/PCHP4wdmtu5cyeXXnopWVlZjBkzhlWrVsXGHTlyJBkZGbHPq1evHldeeSXPPPMMO3bsYNmyZRVO816baI9DpCZKZs/gq70wqV90mvbcK1MWSvx03mvXruV3v/sdb7/9Ns2aNePqq69OOC16w4YNY+8zMjLKfJ7FkWnR4/t4OdOoxysoKGD79u2x2WE3b97Mhx9+WObU4ona69WrV+LzSucSn/tdd93F+eefz0033cS6desYOHBgmeMCXHvttVx22WVAdNLDI4XleKA9DpHaKnj8bCqLRmm7d++mSZMmnHzyyWzZsoX58+dX+Wf06dOH5557DoB333034R7N6tWrOXToEJs2bWLjxo1s3LiR2267jWnTptG7d2/+/ve/x6Z+P3KoKtGU4/Xq1Ys9SfDw4cPMmDGjzLiOTGUOlHiC3oABA5g4cSKHDh0q8Xnt2rWjZcuWTJgwgSuvTN/PKB1SWjjMbKCZvW9m68zs6AOcX/cbYmZuZnnBcqaZ7TOz5cHribi+w8zsXTNbaWbzzKxlWeOKHLeOPH72wv9O68d27dqVTp06kZWVxfXXX19iWvSq8tOf/pRNmzaRnZ3NAw88QFZW1lFTgT/zzDNlTmXeqlUrJk6cyODBg8nJyeGqq64Cyp5y/L777mPgwIH079+ftm3blhnXHXfcwW233XZUzjfccAOtW7eOPT/8SNEDuPLKK2nfvj0dOhxnE1EmmjK3Kl5ABrAeOANoCKwAOiXo1wRYBLwJ5AVtmcB7CfrWBz4DWgbL9wPjKopF06rXDnU951DTqr/6kPvWY5uGvbpVNMX4wYMHfd++fe7u/sEHH3hmZqYfPHgwHaFVuRtuuMGffPLJGjGtekVqyrTqPYB17r4BwMymAYOB0vud9wYFYAwVs+B1kpltB04G1lVZxCK1RRKPn62tioqK6N+/P8XFxbg7f/jDH2rlPRS5ubk0b96cRx55hAMHDlR3OFUqlT+NNsAnccuFQM/4DmbWBWjn7i+ZWenC0d7MlgG7gbvd/VV3P2hmNwLvAnuBtcColGUgImnXrFkzli5dWt1hVFr8vScqHMlL9NzG2OULZlYPeAgYnqDfFuA0d99uZt2AmWbWGdgH3Ah0ATYAvwd+Dvz6qA83GwGMAGjVqhWRSKQyuaRdUVFRrYu5sup6zk2bNmX37t0VPvK0tjt06FCV3j1eG9T0nN2d/fv3J/3/L5WFoxBoF7fcFtgct9wEyAIiwX+U1sAsMxvk7kuAAwDuvtTM1gMdCYqRu68HMLPngIQn3d19EjAJIC8vz/Pz86sssXSIRCLUtpgrq67n/OGHH/LVV1/RokWL47p47NmzhyZNmlR3GGlVk3N2d7Zv306zZs3KfZZ7vFQWjsVABzNrD2wChgKxa9LcfRcQuyLKzCLAGHdfYmanAjvc/ZCZnQF0ILqH0QjoZGanuvvnwHnAmhTmIJI2bdu2pbCwkM8//7y6Q0mp/fv306hRo+oOI61qes6NGjUq94qy0lJWONy92MxuBuYTvcJqsruvMrPxRM/Uzypn877AeDMrBg4BI919B4CZ/QpYZGYHgY9IfKhLpNZp0KAB7du3r+4wUi4SiST9l+3x4njLOaWXKrj7HGBOqbZfltE3P+79dGB6Gf2eAJ5ItE5ERFJPd46LiEgoKhwiIhKKCoeIiISiwiEiIqGocIiISCgq
HCIiEooKh4iIhKLCISIioahwiIhIKCocIiISigqHiIiEosIhIiKhqHCIiEgoKhwiIhKKCoeIiISiwiEiIqGocIiISCgqHCIiEooKh4iIhKLCISIioahwiIhIKCocIiISigqHiIiEosIhIiKhqHCIiEgoKS0cZjbQzN43s3VmNracfkPMzM0sL1jONLN9ZrY8eD0R17ehmU0ysw/M7J9mdlkqcxARkZLqp2pgM8sAHgPOAwqBxWY2y91Xl+rXBBgNvFVqiPXunptg6LuAz9y9o5nVA06p+uhFRKQsqdzj6AGsc/cN7v4VMA0YnKDfvcD9wP4kx70W+A2Aux92921VEayIiCQnZXscQBvgk7jlQqBnfAcz6wK0c/eXzGxMqe3bm9kyYDdwt7u/ambNgnX3mlk+sB642d23lv5wMxsBjABo1aoVkUikClJKn6KioloXc2Up57pBOdd+qSwclqDNYyujh5keAoYn6LcFOM3dt5tZN2CmmXUmGm9b4P/c/VYzuxX4LfD/jvog90nAJIC8vDzPz8+vXDZpFolEqG0xV5ZyrhuUc+2XykNVhUC7uOW2wOa45SZAFhAxs43A2cAsM8tz9wPuvh3A3ZcS3bPoCGwHvgRmBGM8D3RNYQ4iIlJKKgvHYqCDmbU3s4bAUGDWkZXuvsvdW7p7prtnAm8Cg9x9iZmdGpxcx8zOADoAG9zdgf8F8oNh+gMlTraLiEhqpexQlbsXm9nNwHwgA5js7qvMbDywxN1nlbN5X2C8mRUDh4CR7r4jWHcH8Bczexj4HLgmVTmIiMjRUnmOA3efA8wp1fbLMvrmx72fDkwvo99HRAuLiIhUA905LiIioahwiIhIKCocIiISigqHiIiEosIhIiKhqHCIiEgoKhwiIhKKCoeIiISiwiEiIqGocIiISCgqHCIiEooKh4iIhKLCISIioVRYOMzsZjNrno5gRESk5ktmj6M1sNjMnjOzgWaW6JGwIiJSR1RYONz9bqJP4Psz0eeDrzWz/zKz76Q4NhERqYGSOscRPLL10+BVDDQHXjCz+1MYm4iI1EAVPgHQzEYDPwa2AX8CbnP3g2ZWD1gL3J7aEEVEpCZJ5tGxLYFLg0e2xrj7YTP799SEJSIiNVUyh6rmADuOLJhZEzPrCeDua1IVmIiI1EzJFI6JQFHc8t6gTURE6qBkCocFJ8eB6CEqkjvEJSIix6FkCscGMxttZg2C18+ADakOTEREaqZkCsdI4N+ATUAh0BMYkcqgRESk5qrwkJO7fwYMTUMsIiJSCyQzV1UjMxtlZo+b2eQjr2QGD6Yoed/M1pnZ2HL6DTEzN7O8YDnTzPaZ2fLg9USCbWaZ2XvJxCEiIlUnmUNVfyE6X9X5wD+AtsCeijYyswzgMeACoBMwzMw6JejXBBgNvFVq1Xp3zw1eI0ttcyklr/QSEZE0SaZw/Iu7/wLY6+5PAT8Azkpiux7AOnff4O5fAdOAwQn63QvcD+xPJmAzawzcCvw6mf4iIlK1krms9mDwdaeZZRGdryozie3aAJ/ELR85sR5jZl2Adu7+kpmNKbV9ezNbBuwG7nb3V4P2e4EHgC/L+3AzG0FwEr9Vq1ZEIpEkQq45ioqKal3MlaWc6wblXPslUzgmBc/juBuYBTQGfpHEdommX4/dDxLMdfUQ0Rl3S9sCnObu282sGzDTzDoDZxDdA7rFzDLL+3B3nwRMAsjLy/P8/PwkQq45IpEItS3mylLOdYNyrv3KLRzBL/fd7v4FsIjoL+5kFQLt4pbbApvjlpsAWUAkeMRHa2CWmQ1y9yXAAQB3X2pm64GOQHegm5ltDGL/pplF3D0/RFwiIlIJ5Z7jCO4Sv/kYx14MdDCz9mbWkOglvbPixt7l7i3dPdPdM4E3gUHuvsTMTg1OrmNmZxB9HsgGd5/o7t8O+vcBPlDREBFJr2ROjv/NzMaYWTszO+XIq6KN3L2YaNGZD6wBnnP3VWY23swGVbB5X2Clma0AXgBGuvuOCrYREZE0SOYcx7XB11FxbU4Sh63cfQ7R2XXj235ZRt/8uPfTgekVjL2R6KEuERFJo2TuHG+fjkBERKR2SOYJgD9K1O7uT1d9OCIiUtMlc6iqe9z7RkB/4B1AhUNEpA5K5lDVT+OXzawp0WlIRESkDkrmqqrSviR6eayIiNRByZzj+F++vuO7HtEJC59LZVAiIlJzJXOO47dx74uBj9y9MEXxiIhIDZdM4fgY2OLu+wHM7AQzywzuoxARkTommXMczwOH45YPBW0iIlIHJVM46gfP0wAgeN8wdSGJiEhNlkzh+Dx+bikzGwxsS11IIiJSkyVzjmMkMNXMHg2WC4GEd5OLiMjxL5kbANcDZwePbDV3r/B54yIicvyq8FCVmf2XmTVz9yJ332Nmzc1Mz/sWEamjkjnHcYG77zyyEDwN8MLUhSQiIjVZMoUjw8y+cWTBzE4AvlFOfxEROY4lc3L8r8BCM5sSLF8DPJW6kEREpCZL5uT4/Wa2Evg+YMA84PRUByYiIjVTsrPjfkr07vHLiD6PY03KIhIRkRqtzD0OM+sIDAWGAduBZ4lejtsvTbGJiEgNVN6hqn8CrwIXufs6ADO7JS1RiYhIjVXeoarLiB6iesXM/mhm/Yme4xARkTqszMLh7jPc/QrgTCAC3AK0MrOJZjYgTfGJiEgNU+HJcXff6+5T3f3fgbbAcmBsyiMTEZEaKdQzx919h7v/wd2/l0x/MxtoZu+b2TozK7PYmNkQM3MzywuWM81sn5ktD15PBO0nmtlsM/unma0yswlh4hcRkcpL5gbAY2JmGcBjwHlEZ9RdbGaz3H11qX5NgNHAW6WGWO/uuQmG/q27v2JmDYnemHiBu89NQQoiIpJAqD2OkHoA69x9Q/Dwp2nA4AT97gXuB/ZXNKC7f+nurwTvvwLeIXr4TERE0iSVhaMN8EnccmHQFmNmXYB27v5Sgu3bm9kyM/uHmZ1TeqWZNQMuAhZWYcwiIlKBlB2qIvGlux5baVYPeAgYnqDfFuA0d99uZt2AmWbW2d13B9vWBwqAR9x9Q8IPNxsBjABo1aoVkUikEqmkX1FRUa2LubKUc92gnGu/VBaOQqBd3HJbYHPcchMgC4iYGUBrYJaZDXL3JcABAHdfambrgY7AkmDbScBad3+4rA9390lBP/Ly8jw/P78qckqbSCRCbYu5spRz3aCca79UHqpaDHQws/bBieyhwKwjK919l7u3dPdMd88E3gQGufsSMzs1OLmOmZ0BdAA2BMu/BpoC/5HC2EVEpAwpKxzuXgzcDMwnOinic+6+yszGm9mgCjbvC6w0sxXAC8BId99hZm2Bu4BOwDvBpbrXpSoHERE5WioPVeHuc4A5pdp+WUbf/Lj304HpCfoUomlPRESqVSoPVYmIyHFIhUNEREJR4RARkVBUOEREJBQVDhERCUWFQ0REQlHhEBGRUFQ4REQkFBUOEREJRYVDRERCUeEQEZFQVDhERCQUFQ4REQlFhUNEREJR4RARkVBUOEREJBQVDhERCUWFQ0REQlHhEBGRUFQ4REQkFBUOERE
JRYVDRERCUeEQEZFQVDhERCQUFQ4REQklpYXDzAaa2ftmts7MxpbTb4iZuZnlBcuZZrbPzJYHryfi+nYzs3eDMR8xM0tlDiIiUlL9VA1sZhnAY8B5QCGw2MxmufvqUv2aAKOBt0oNsd7dcxMMPREYAbwJzAEGAnOrOHwRESlDKvc4egDr3H2Du38FTAMGJ+h3L3A/sL+iAc3sW8DJ7v6GuzvwNHBxFcYsIiIVSGXhaAN8ErdcGLTFmFkXoJ27v5Rg+/ZmtszM/mFm58SNWVjemCIiklopO1QFJDr34LGVZvWAh4DhCfptAU5z9+1m1g2YaWadKxqzxIebjSB6SItWrVoRiURCBV/dioqKal3MlaWc6wblXPulsnAUAu3iltsCm+OWmwBZQCQ4v90amGVmg9x9CXAAwN2Xmtl6oGMwZttyxoxx90nAJIC8vDzPz8+vgpTSJxKJUNtirizlXMbPx4YAAAmtSURBVDco59ovlYeqFgMdzKy9mTUEhgKzjqx0913u3tLdM909k+jJ7kHuvsTMTg1OrmNmZwAdgA3uvgXYY2ZnB1dT/Qh4MYU5iIhIKSnb43D3YjO7GZgPZACT3X2VmY0Hlrj7rHI27wuMN7Ni4BAw0t13BOtuBJ4ETiB6NZWuqBIRSaNUHqrC3ecQvWQ2vu2XZfTNj3s/HZheRr8lRA9xiYhINdCd4yIiEooKh4iIhKLCISIioahwiIhIKCocIiISigqHiIiEosIhIiKhqHCIiEgoKhwiIhKKCoeIiISiwiEiIqGocIiISCgqHCIiEooKh4iIhKLCISIioahwiIhIKCocIiISigqHiIiEosIhIiKhqHCIiEgoKhwiIhKKCoeIiISiwiEiIqGocIiISCgqHCIiEkpKC4eZDTSz981snZmNLaffEDNzM8sr1X6amRWZ2Zi4tlvMbJWZvWdmBWbWKJU5iIhISSkrHGaWATwGXAB0AoaZWacE/ZoAo4G3EgzzEDA3rm+boG+eu2cBGcDQqo9eRETKkso9jh7AOnff4O5fAdOAwQn63QvcD+yPbzSzi4ENwKpS/esDJ5hZfeBEYHNVBy6SLjPWflXdIYiElsrC0Qb4JG65MGiLMbMuQDt3f6lU+0nAHcCv4tvdfRPwW+BjYAuwy90XVH3oIunx4vqD1R2CSGj1Uzi2JWjz2EqzekQPRQ1P0O9XwEPuXmT29TBm1pzoXkt7YCfwvJld7e5/PerDzUYAIwBatWpFJBI55kSqQ1FRUa2LubLqYs5Ancu5Lv6cj7ecU1k4CoF2ccttKXlYqQmQBUSC4tAamGVmg4CewBAzux9oBhw2s/3AVuBDd/8cwMz+B/g34KjC4e6TgEkAeXl5np+fX6XJpVokEqG2xVxZdTFn5s2ucznXxZ/z8ZZzKgvHYqCDmbUHNhE9iX3lkZXuvgtoeWTZzCLAGHdfApwT1z4OKHL3R82sJ3C2mZ0I7AP6A0tSmIOIiJSSssLh7sVmdjMwn+jVT5PdfZWZjQeWuPusYxjzLTN7AXgHKAaWEexViNR0D/3tA363cO1R7ZljZ5dY/ln/DtxyXsd0hSUSWir3OHD3OcCcUm2/LKNvfhnt40ot3wPcUzURiqTPLed1PKogZI6dzcYJP6imiESOje4cFxGRUFQ4REQkFBUOEREJRYVDpBoN/k6D6g5BJDQVDpFqdEmHhtUdgkhoKhwiIhKKCoeIiISiwiEiIqGYu1fcq5Yzs8+Bj6o7jpBaAtuqO4g0U851g3KuPU5391NLN9aJwlEbmdkSd8+ruOfxQznXDcq59tOhKhERCUWFQ0REQlHhqLnq4qy/yrluUM61nM5xiIhIKNrjEBGRUFQ4qoGZDTSz981snZmNTbD+dDNbaGYrzSxiZm3j1p1mZgvMbI2ZrTazzHTGfqwqmfP9ZrYqyPkRi38QfQ1lZpPN7DMze6+M9Rbksi7IuWvcuh+b2drg9eP0RV05x5qzmeWa2RvBz3ilmV2R3siPXWV+zsH6k81sk5k9mp6Iq4i765XGF9GnIa4HzgAaAiuATqX6PA/8OHj/PeAvcesiwHnB+8bAidWdUypzJvpM+f8LxsgA3gDyqzunJHLuC3QF3itj/YXAXMCAs4G3gvZTgA3B1+bB++bVnU+Kc+4IdAjefxvYAjSr7nxSmXPc+t8BzwCPVncuYV7a40i/HsA6d9/g7l8B04DBpfp0AhYG7185st7MOgH13f1vAO5e5O5fpifsSjnmnAEHGhEtON8AGgBbUx5xJbn7ImBHOV0GA0971JtAMzP7FnA+8Dd33+HuXwB/AwamPuLKO9ac3f0Dd18bjLEZ+Aw46qazmqgSP2fMrBvQCliQ+kirlgpH+rUBPolbLgza4q0ALgveXwI0MbMWRP8y22lm/2Nmy8zsv80sI+URV94x5+zubxAtJFuC13x3X5PieNOhrO9JMt+r2qrC3MysB9E/EtanMa5USpizmdUDHgBuq5aoKkmFI/0SHZ8vfWnbGOBcM1sGnAtsAoqJPiP+nGB9d6KHfoanLNKqc8w5m9m/AP8KtCX6n/B7ZtY3lcGmSVnfk2S+V7VVubkFf4n/BbjG3Q+nLarUKivnm4A57v5JgvU1Xv3qDqAOKgTaxS23BTbHdwh21y8FMLPGwGXuvsvMCoFl7r4hWDeT6HHTP6cj8EqoTM4jgDfdvShYN5dozovSEXgKlfU9KQTyS7VH0hZVapX578DMTgZmA3cHh3SOF2Xl3As4x8xuInqusqGZFbn7UReO1ETa40i/xUAHM2tvZg2BocCs+A5m1jLYlQX4OTA5btvmZnbk+O/3gNVpiLmyKpPzx0T3ROqbWQOieyPHw6GqWcCPgqtuzgZ2ufsWYD4wwMyam1lzYEDQdjxImHPwb2IG0XMBz1dviFUuYc7ufpW7n+bumUT3tp+uLUUDtMeRdu5ebGY3E/1lkAFMdvdVZjYeWOLus4j+xfkbM3Oif1mPCrY9ZGZjgIXBJalLgT9WRx5hVCZn4AWiBfJdorv489z9f9OdQ1hmVkA0p5bBnuI9RE/s4+5PAHOIXnGzDvgSuCZYt8PM7iVabAHGu3t5J19rjGPNGbic6NVJLcxseNA23N2Xpy34Y1SJnGs13TkuIiKh6FCViIiEosIhIiKhqHCIiEgoKhwiIhKKCoeIiISiwiFyjMzskJktj3tV2XX4ZpZZ1oyrItVN93GIHLt97p5b3UGIpJv2OESqmJltNLP7zOzt4PUvQXv8M0cWmtlpQXsrM5thZiuC178FQ2WY2R+D51QsMLMTgv6jLfoslpVmNq2a0pQ6TIVD5NidUOpQVfwDiHa7ew/gUeDhoO1RolNLZANTgUeC9keAf7h7DtFnO6wK2jsAj7l7Z2AnX88ePBboEowzMlXJiZRFd46LHKNgUrrGCdo3At9z9w3B/FqfunsLM9sGfMvdDwbtW9y9pZl9DrR19wNxY2QSfS5Hh2D5DqCBu//azOYBRcBMYOaRCSBF0kV7HCKp4WW8L6tPIgfi3h/i63OSPwAeA7oBS81M5yolrVQ4RFLjirivbwTvXyc6MzDAVc
BrwfuFwI0AZpYRTDGeUDCDcDt3fwW4HWhGdFpukbTRXyoix+4EM4ufwXVe3NTY3zCzt4j+cTYsaBsNTDaz24DP+Xqm1J8Bk8zsJ0T3LG4k+rTDRDKAv5pZU6IPCXrI3XdWWUYiSdA5DpEqFpzjyHP3bdUdi0gq6FCViIiEoj0OEREJRXscIiISigqHiIiEosIhIiKhqHCIiEgoKhwiIhKKCoeIiITy/wFSgyl52ulROAAAAABJRU5ErkJggg==\n",
- "text/plain": [
- "<Figure size 432x288 with 1 Axes>"
- ]
- },
- "metadata": {
- "needs_background": "light"
- },
- "output_type": "display_data"
- }
- ],
- "source": [
- "# Plotting our accuracy charts\n",
- "import matplotlib.pyplot as plt\n",
- "\n",
- "history_dict = history.history\n",
- "\n",
- "acc_values = history_dict['accuracy']\n",
- "val_acc_values = history_dict['val_accuracy']\n",
- "epochs = range(1, len(acc_values) + 1)\n",
- "\n",
- "line1 = plt.plot(epochs, val_acc_values, label='Validation/Test Accuracy')\n",
- "line2 = plt.plot(epochs, acc_values, label='Training Accuracy')\n",
- "plt.setp(line1, linewidth=2.0, marker='+', markersize=10.0)\n",
- "plt.setp(line2, linewidth=2.0, marker='4', markersize=10.0)\n",
- "plt.xlabel('Epochs')\n",
- "plt.ylabel('Accuracy')\n",
- "plt.grid(True)\n",
- "plt.legend()\n",
- "plt.show()"
- ]
- },
- {
- "cell_type": "markdown",
- "metadata": {},
- "source": [
- "### Let's run some tests"
- ]
- },
- {
- "cell_type": "code",
- "execution_count": 19,
- "metadata": {},
- "outputs": [],
- "source": [
- "import cv2\n",
- "import numpy as np\n",
- "from tensorflow.keras.models import load_model\n",
- "\n",
- "img_row, img_height, img_depth = 32, 32, 3\n",
- "classifier = load_model('cifar_simple_cnn_2.h5')\n",
- "color = True\n",
- "scale = 8\n",
- "\n",
- "def draw_test(name, res, input_im, scale, img_row, img_height):\n",
- "    BLACK = [0, 0, 0]\n",
- "    res = int(res)\n",
- "    if res == 0:\n",
- "        pred = \"airplane\"\n",
- "    if res == 1:\n",
- "        pred = \"automobile\"\n",
- "    if res == 2:\n",
- "        pred = \"bird\"\n",
- "    if res == 3:\n",
- "        pred = \"cat\"\n",
- "    if res == 4:\n",
- "        pred = \"deer\"\n",
- "    if res == 5:\n",
- "        pred = \"dog\"\n",
- "    if res == 6:\n",
- "        pred = \"frog\"\n",
- "    if res == 7:\n",
- "        pred = \"horse\"\n",
- "    if res == 8:\n",
- "        pred = \"ship\"\n",
- "    if res == 9:\n",
- "        pred = \"truck\"\n",
- "\n",
- "    expanded_image = cv2.copyMakeBorder(input_im, 0, 0, 0, imageL.shape[0]*2, cv2.BORDER_CONSTANT, value=BLACK)\n",
- "    if color == False:\n",
- "        expanded_image = cv2.cvtColor(expanded_image, cv2.COLOR_GRAY2BGR)\n",
- "    cv2.putText(expanded_image, str(pred), (300, 80), cv2.FONT_HERSHEY_COMPLEX_SMALL, 3, (0,255,0), 2)\n",
- "    cv2.imshow(name, expanded_image)\n",
- "\n",
- "\n",
- "for i in range(0, 10):\n",
- "    rand = np.random.randint(0, len(x_test))\n",
- "    input_im = x_test[rand]\n",
- "    imageL = cv2.resize(input_im, None, fx=scale, fy=scale, interpolation=cv2.INTER_CUBIC)\n",
- "    input_im = input_im.reshape(1, img_row, img_height, img_depth)\n",
- "\n",
- "    ## Get Prediction\n",
- "    res = str(classifier.predict_classes(input_im, 1, verbose=0)[0])\n",
- "\n",
- "    draw_test(\"Prediction\", res, imageL, scale, img_row, img_height)\n",
- "    cv2.waitKey(0)\n",
- "\n",
- "cv2.destroyAllWindows()"
- ]
- },
- {
- "cell_type": "code",
- "execution_count": null,
- "metadata": {},
- "outputs": [],
- "source": []
- }
- ],
- "metadata": {
- "kernelspec": {
- "display_name": "Python 3",
- "language": "python",
- "name": "python3"
- },
- "language_info": {
- "codemirror_mode": {
- "name": "ipython",
- "version": 3
- },
- "file_extension": ".py",
- "mimetype": "text/x-python",
- "name": "python",
- "nbconvert_exporter": "python",
- "pygments_lexer": "ipython3",
- "version": "3.7.4"
- }
- },
- "nbformat": 4,
- "nbformat_minor": 2
- }
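Aside on the removed test cell above: it calls classifier.predict_classes(), which existed only on Sequential models in older TensorFlow releases and was removed in TensorFlow 2.6. A minimal sketch of the usual replacement, assuming a TF 2.x environment and the cifar_simple_cnn_2.h5 file this notebook saves; the random array is a stand-in for a real CIFAR-10 test image:

# Sketch only — predict_classes() is gone in TensorFlow >= 2.6;
# argmax over the softmax output yields the same class index.
import numpy as np
from tensorflow.keras.models import load_model

classifier = load_model('cifar_simple_cnn_2.h5')           # model saved by this notebook
input_im = np.random.rand(1, 32, 32, 3).astype('float32')  # stand-in for one 32x32x3 test image
probs = classifier.predict(input_im, verbose=0)            # (1, 10) array of class probabilities
res = int(np.argmax(probs, axis=-1)[0])                    # same 0-9 index predict_classes returned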
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:3256af9109e9e327208bf0a65c7b29c4cc9ab2059243e19885a2465c6399958b
+ size 38096
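The three added lines are a Git LFS pointer — what replaced the notebook content removed above: oid is the SHA-256 digest of the real file, and size is its length in bytes. A minimal sketch for checking a fetched copy against this pointer; 'notebook.ipynb' is a hypothetical local path, not a name taken from this repo:

# Verify a locally fetched file against the LFS pointer fields above.
import hashlib

data = open('notebook.ipynb', 'rb').read()  # hypothetical path to the fetched notebook
digest = hashlib.sha256(data).hexdigest()
assert digest == '3256af9109e9e327208bf0a65c7b29c4cc9ab2059243e19885a2465c6399958b'
assert len(data) == 38096                   # byte size recorded in the pointer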
8. Making a CNN in Keras/8.3 to 8.10 - Building a CNN for handwritten digits - MNIST.ipynb CHANGED
The diff for this file is too large to render. See raw diff.

9. Visualizing What CNNs 'see' & Filter Visualization/9.1 Activation Maximization using Keras Visualization Toolkit.ipynb CHANGED
The diff for this file is too large to render. See raw diff.

9. Visualizing What CNNs 'see' & Filter Visualization/9.2 Saliency Maps.ipynb CHANGED
The diff for this file is too large to render. See raw diff.

9. Visualizing What CNNs 'see' & Filter Visualization/9.3A Visualizing Filter Patterns.ipynb CHANGED
The diff for this file is too large to render. See raw diff.

9. Visualizing What CNNs 'see' & Filter Visualization/9.3B Visualizing Filter Patterns - VGG16.ipynb CHANGED
The diff for this file is too large to render. See raw diff.

9. Visualizing What CNNs 'see' & Filter Visualization/9.4 Heat Map Visualizations of Class Activation.ipynb CHANGED
The diff for this file is too large to render. See raw diff.