diff --git a/README.md b/README.md index aea002a2..9c205400 100644 --- a/README.md +++ b/README.md @@ -13,38 +13,42 @@ It is suitable for beginners who want to find clear and concise examples about T - [Introduction to MNIST Dataset](https://github.com/aymericdamien/TensorFlow-Examples/blob/master/tensorflow_v2/notebooks/0_Prerequisite/mnist_dataset_intro.ipynb). #### 1 - Introduction -- **Hello World** ([notebook](https://github.com/aymericdamien/TensorFlow-Examples/blob/master/tensorflow_v2/notebooks/1_Introduction/helloworld.ipynb)). Very simple example to learn how to print "hello world" using TensorFlow 2.0. -- **Basic Operations** ([notebook](https://github.com/aymericdamien/TensorFlow-Examples/blob/master/tensorflow_v2/notebooks/1_Introduction/basic_operations.ipynb)). A simple example that cover TensorFlow 2.0 basic operations. +- **Hello World** ([notebook](https://github.com/aymericdamien/TensorFlow-Examples/blob/master/tensorflow_v2/notebooks/1_Introduction/helloworld.ipynb)). Very simple example to learn how to print "hello world" using TensorFlow 2.0+. +- **Basic Operations** ([notebook](https://github.com/aymericdamien/TensorFlow-Examples/blob/master/tensorflow_v2/notebooks/1_Introduction/basic_operations.ipynb)). A simple example that covers TensorFlow 2.0+ basic operations. #### 2 - Basic Models -- **Linear Regression** ([notebook](https://github.com/aymericdamien/TensorFlow-Examples/blob/master/tensorflow_v2/notebooks/2_BasicModels/linear_regression.ipynb)). Implement a Linear Regression with TensorFlow 2.0. -- **Logistic Regression** ([notebook](https://github.com/aymericdamien/TensorFlow-Examples/blob/master/tensorflow_v2/notebooks/2_BasicModels/logistic_regression.ipynb)). Implement a Logistic Regression with TensorFlow 2.0. -- **Word2Vec (Word Embedding)** ([notebook](https://github.com/aymericdamien/TensorFlow-Examples/blob/master/tensorflow_v2/notebooks/2_BasicModels/word2vec.ipynb)). Build a Word Embedding Model (Word2Vec) from Wikipedia data, with TensorFlow 2.0. +- **Linear Regression** ([notebook](https://github.com/aymericdamien/TensorFlow-Examples/blob/master/tensorflow_v2/notebooks/2_BasicModels/linear_regression.ipynb)). Implement a Linear Regression with TensorFlow 2.0+. +- **Logistic Regression** ([notebook](https://github.com/aymericdamien/TensorFlow-Examples/blob/master/tensorflow_v2/notebooks/2_BasicModels/logistic_regression.ipynb)). Implement a Logistic Regression with TensorFlow 2.0+. +- **Word2Vec (Word Embedding)** ([notebook](https://github.com/aymericdamien/TensorFlow-Examples/blob/master/tensorflow_v2/notebooks/2_BasicModels/word2vec.ipynb)). Build a Word Embedding Model (Word2Vec) from Wikipedia data, with TensorFlow 2.0+. +- **GBDT (Gradient Boosted Decision Trees)** ([notebook](https://github.com/aymericdamien/TensorFlow-Examples/blob/master/tensorflow_v2/notebooks/2_BasicModels/gradient_boosted_trees.ipynb)). Implement Gradient Boosted Decision Trees with TensorFlow 2.0+ to predict house values on the Boston Housing dataset. #### 3 - Neural Networks ##### Supervised - **Simple Neural Network** ([notebook](https://github.com/aymericdamien/TensorFlow-Examples/blob/master/tensorflow_v2/notebooks/3_NeuralNetworks/neural_network.ipynb)). Use TensorFlow 2.0 'layers' and 'model' API to build a simple neural network to classify MNIST digits dataset. - **Simple Neural Network (low-level)** ([notebook](https://github.com/aymericdamien/TensorFlow-Examples/blob/master/tensorflow_v2/notebooks/3_NeuralNetworks/neural_network_raw.ipynb)). 
Raw implementation of a simple neural network to classify MNIST digits dataset. -- **Convolutional Neural Network** ([notebook](https://github.com/aymericdamien/TensorFlow-Examples/blob/master/tensorflow_v2/notebooks/3_NeuralNetworks/convolutional_network.ipynb)). Use TensorFlow 2.0 'layers' and 'model' API to build a convolutional neural network to classify MNIST digits dataset. +- **Convolutional Neural Network** ([notebook](https://github.com/aymericdamien/TensorFlow-Examples/blob/master/tensorflow_v2/notebooks/3_NeuralNetworks/convolutional_network.ipynb)). Use TensorFlow 2.0+ 'layers' and 'model' API to build a convolutional neural network to classify MNIST digits dataset. - **Convolutional Neural Network (low-level)** ([notebook](https://github.com/aymericdamien/TensorFlow-Examples/blob/master/tensorflow_v2/notebooks/3_NeuralNetworks/convolutional_network_raw.ipynb)). Raw implementation of a convolutional neural network to classify MNIST digits dataset. - **Recurrent Neural Network (LSTM)** ([notebook](https://github.com/aymericdamien/TensorFlow-Examples/blob/master/tensorflow_v2/notebooks/3_NeuralNetworks/recurrent_network.ipynb)). Build a recurrent neural network (LSTM) to classify MNIST digits dataset, using TensorFlow 2.0 'layers' and 'model' API. -- **Bi-directional Recurrent Neural Network (LSTM)** ([notebook](https://github.com/aymericdamien/TensorFlow-Examples/blob/master/tensorflow_v2/notebooks/3_NeuralNetworks/bidirectional_rnn.ipynb)). Build a bi-directional recurrent neural network (LSTM) to classify MNIST digits dataset, using TensorFlow 2.0 'layers' and 'model' API. -- **Dynamic Recurrent Neural Network (LSTM)** ([notebook](https://github.com/aymericdamien/TensorFlow-Examples/blob/master/tensorflow_v2/notebooks/3_NeuralNetworks/dynamic_rnn.ipynb)). Build a recurrent neural network (LSTM) that performs dynamic calculation to classify sequences of variable length, using TensorFlow 2.0 'layers' and 'model' API. +- **Bi-directional Recurrent Neural Network (LSTM)** ([notebook](https://github.com/aymericdamien/TensorFlow-Examples/blob/master/tensorflow_v2/notebooks/3_NeuralNetworks/bidirectional_rnn.ipynb)). Build a bi-directional recurrent neural network (LSTM) to classify MNIST digits dataset, using TensorFlow 2.0+ 'layers' and 'model' API. +- **Dynamic Recurrent Neural Network (LSTM)** ([notebook](https://github.com/aymericdamien/TensorFlow-Examples/blob/master/tensorflow_v2/notebooks/3_NeuralNetworks/dynamic_rnn.ipynb)). Build a recurrent neural network (LSTM) that performs dynamic calculation to classify sequences of variable length, using TensorFlow 2.0+ 'layers' and 'model' API. ##### Unsupervised - **Auto-Encoder** ([notebook](https://github.com/aymericdamien/TensorFlow-Examples/blob/master/tensorflow_v2/notebooks/3_NeuralNetworks/autoencoder.ipynb)). Build an auto-encoder to encode an image to a lower dimension and re-construct it. - **DCGAN (Deep Convolutional Generative Adversarial Networks)** ([notebook](https://github.com/aymericdamien/TensorFlow-Examples/blob/master/tensorflow_v2/notebooks/3_NeuralNetworks/dcgan.ipynb)). Build a Deep Convolutional Generative Adversarial Network (DCGAN) to generate images from noise. #### 4 - Utilities -- **Save and Restore a model** ([notebook](https://github.com/aymericdamien/TensorFlow-Examples/blob/master/tensorflow_v2/notebooks/4_Utils/save_restore_model.ipynb)). Save and Restore a model with TensorFlow 2.0. 
-- **Build Custom Layers & Modules** ([notebook](https://github.com/aymericdamien/TensorFlow-Examples/blob/master/tensorflow_v2/notebooks/4_Utils/build_custom_layers.ipynb)). Learn how to build your own layers / modules and integrate them into TensorFlow 2.0 Models. +- **Save and Restore a model** ([notebook](https://github.com/aymericdamien/TensorFlow-Examples/blob/master/tensorflow_v2/notebooks/4_Utils/save_restore_model.ipynb)). Save and Restore a model with TensorFlow 2.0+. +- **Build Custom Layers & Modules** ([notebook](https://github.com/aymericdamien/TensorFlow-Examples/blob/master/tensorflow_v2/notebooks/4_Utils/build_custom_layers.ipynb)). Learn how to build your own layers / modules and integrate them into TensorFlow 2.0+ Models. +- **TensorBoard** ([notebook](https://github.com/aymericdamien/TensorFlow-Examples/blob/master/tensorflow_v2/notebooks/4_Utils/tensorboard.ipynb)). Track and visualize neural network computation graph, metrics, weights and more, using TensorFlow 2.0+ TensorBoard. #### 5 - Data Management - **Load and Parse data** ([notebook](https://github.com/aymericdamien/TensorFlow-Examples/blob/master/tensorflow_v2/notebooks/5_DataManagement/load_data.ipynb)). Build efficient data pipeline with TensorFlow 2.0 (Numpy arrays, Images, CSV files, custom data, ...). -- **Build and Load TFRecords** ([notebook](https://github.com/aymericdamien/TensorFlow-Examples/blob/master/tensorflow_v2/notebooks/5_DataManagement/tfrecords.ipynb)). Convert data into TFRecords format, and load them with TensorFlow 2.0. -- **Image Transformation (i.e. Image Augmentation)** ([notebook](https://github.com/aymericdamien/TensorFlow-Examples/blob/master/tensorflow_v2/notebooks/5_DataManagement/image_transformation.ipynb)). Apply various image augmentation techniques with TensorFlow 2.0, to generate distorted images for training. +- **Build and Load TFRecords** ([notebook](https://github.com/aymericdamien/TensorFlow-Examples/blob/master/tensorflow_v2/notebooks/5_DataManagement/tfrecords.ipynb)). Convert data into TFRecords format, and load them with TensorFlow 2.0+. +- **Image Transformation (i.e. Image Augmentation)** ([notebook](https://github.com/aymericdamien/TensorFlow-Examples/blob/master/tensorflow_v2/notebooks/5_DataManagement/image_transformation.ipynb)). Apply various image augmentation techniques with TensorFlow 2.0+, to generate distorted images for training. +#### 6 - Hardware +- **Multi-GPU Training** ([notebook](https://github.com/aymericdamien/TensorFlow-Examples/blob/master/tensorflow_v2/notebooks/6_Hardware/multigpu_training.ipynb)). Train a convolutional neural network with multiple GPUs on the CIFAR-10 dataset. ## TensorFlow v1 @@ -134,3 +138,12 @@ The tutorial index for TF v1 is available here: [TensorFlow v1.15 Examples](tens - **Basic Operations on multi-GPU** ([notebook](https://github.com/aymericdamien/TensorFlow-Examples/blob/master/tensorflow_v1/notebooks/6_MultiGPU/multigpu_basics.ipynb)) ([code](https://github.com/aymericdamien/TensorFlow-Examples/blob/master/tensorflow_v1/examples/6_MultiGPU/multigpu_basics.py)). A simple example to introduce multi-GPU in TensorFlow. - **Train a Neural Network on multi-GPU** ([notebook](https://github.com/aymericdamien/TensorFlow-Examples/blob/master/tensorflow_v1/notebooks/6_MultiGPU/multigpu_cnn.ipynb)) ([code](https://github.com/aymericdamien/TensorFlow-Examples/blob/master/tensorflow_v1/examples/6_MultiGPU/multigpu_cnn.py)). A clear and simple TensorFlow implementation to train a convolutional neural network on multiple GPUs; a rough sketch of the idea follows below.
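As a rough illustration of what those two v1 multi-GPU examples do (not the notebooks' exact code), the sketch below splits a batch across two GPU towers that share variables and averages the per-tower losses; `tower_loss` and the tensor shapes here are hypothetical stand-ins chosen for brevity:

```python
import tensorflow.compat.v1 as tf

tf.disable_v2_behavior()  # TF v1-style graph mode

def tower_loss(x):
    # Hypothetical stand-in for a real model: one dense layer + a squared-error-style loss.
    w = tf.get_variable("w", [784, 10])
    return tf.reduce_mean(tf.square(tf.matmul(x, w)))

x = tf.placeholder(tf.float32, [None, 784])
shards = tf.split(x, 2)  # one shard of the batch per GPU

losses = []
for i, shard in enumerate(shards):
    # Pin each tower's ops to its own GPU; AUTO_REUSE shares the weights between towers.
    with tf.device("/gpu:%d" % i), tf.variable_scope("model", reuse=tf.AUTO_REUSE):
        losses.append(tower_loss(shard))

with tf.device("/cpu:0"):
    total_loss = tf.reduce_mean(tf.stack(losses))  # aggregate on the CPU
```

The actual notebooks flesh this pattern out with a CNN and a training loop, but the core idea is the same: explicit `tf.device` placement plus variables shared across towers.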
+## More Examples +The following examples come from [TFLearn](https://github.com/tflearn/tflearn), a library that provides a simplified interface for TensorFlow. You can have a look; there are many [examples](https://github.com/tflearn/tflearn/tree/master/examples) and [pre-built operations and layers](http://tflearn.org/doc_index/#api). + +### Tutorials +- [TFLearn Quickstart](https://github.com/tflearn/tflearn/blob/master/tutorials/intro/quickstart.md). Learn the basics of TFLearn through a concrete machine learning task. Build and train a deep neural network classifier. + +### Examples +- [TFLearn Examples](https://github.com/tflearn/tflearn/blob/master/examples). A large collection of examples using TFLearn. + diff --git a/input_data.py b/input_data.py index d1d0d28e..0ebcbdad 100644 --- a/input_data.py +++ b/input_data.py @@ -90,7 +90,7 @@ def num_examples(self): def epochs_completed(self): return self._epochs_completed def next_batch(self, batch_size, fake_data=False): - """Return the next `batch_size` examples from this data set.""" + """Return the next `batch_size` examples from this data set.""" if fake_data: fake_image = [1.0 for _ in xrange(784)] fake_label = 0 @@ -141,4 +141,4 @@ class DataSets(object): data_sets.train = DataSet(train_images, train_labels) data_sets.validation = DataSet(validation_images, validation_labels) data_sets.test = DataSet(test_images, test_labels) - return data_sets \ No newline at end of file + return data_sets diff --git a/notebooks/3_NeuralNetworks/bidirectional_rnn.ipynb b/notebooks/3_NeuralNetworks/bidirectional_rnn.ipynb index 2435b229..9595cc50 100644 --- a/notebooks/3_NeuralNetworks/bidirectional_rnn.ipynb +++ b/notebooks/3_NeuralNetworks/bidirectional_rnn.ipynb @@ -23,7 +23,7 @@ "\"nn\"\n", "\n", "References:\n", - "- [Long Short Term Memory](http://deeplearning.cs.cmu.edu/pdfs/Hochreiter97_lstm.pdf), Sepp Hochreiter & Jurgen Schmidhuber, Neural Computation 9(8): 1735-1780, 1997.\n", + "- [Long Short Term Memory](https://www.researchgate.net/profile/Sepp_Hochreiter/publication/13853244_Long_Short-term_Memory/links/5700e75608aea6b7746a0624/Long-Short-term-Memory.pdf), Sepp Hochreiter & Jurgen Schmidhuber, Neural Computation 9(8): 1735-1780, 1997.\n", "\n", "## MNIST Dataset Overview\n", "\n", diff --git a/resources/img/tf2/tensorboard1.png b/resources/img/tf2/tensorboard1.png new file mode 100644 index 00000000..e0609a90 Binary files /dev/null and b/resources/img/tf2/tensorboard1.png differ diff --git a/resources/img/tf2/tensorboard2.png b/resources/img/tf2/tensorboard2.png new file mode 100644 index 00000000..27b746ef Binary files /dev/null and b/resources/img/tf2/tensorboard2.png differ diff --git a/resources/img/tf2/tensorboard3.png b/resources/img/tf2/tensorboard3.png new file mode 100644 index 00000000..cf8771ae Binary files /dev/null and b/resources/img/tf2/tensorboard3.png differ diff --git a/resources/img/tf2/tensorboard4.png b/resources/img/tf2/tensorboard4.png new file mode 100644 index 00000000..418f697e Binary files /dev/null and b/resources/img/tf2/tensorboard4.png differ diff --git a/tensorflow_v2/README.md b/tensorflow_v2/README.md index ed23a174..ef6785f1 100644 --- a/tensorflow_v2/README.md +++ b/tensorflow_v2/README.md @@ -14,6 +14,7 @@ - **Linear Regression** ([notebook](https://github.com/aymericdamien/TensorFlow-Examples/blob/master/tensorflow_v2/notebooks/2_BasicModels/linear_regression.ipynb)). Implement a Linear Regression with TensorFlow 2.0. 
- **Logistic Regression** ([notebook](https://github.com/aymericdamien/TensorFlow-Examples/blob/master/tensorflow_v2/notebooks/2_BasicModels/logistic_regression.ipynb)). Implement a Logistic Regression with TensorFlow 2.0. - **Word2Vec (Word Embedding)** ([notebook](https://github.com/aymericdamien/TensorFlow-Examples/blob/master/tensorflow_v2/notebooks/2_BasicModels/word2vec.ipynb)). Build a Word Embedding Model (Word2Vec) from Wikipedia data, with TensorFlow 2.0. +- **GBDT (Gradient Boosted Decision Trees)** ([notebook](https://github.com/aymericdamien/TensorFlow-Examples/blob/master/tensorflow_v2/notebooks/2_BasicModels/gradient_boosted_trees.ipynb)). Implement Gradient Boosted Decision Trees with TensorFlow 2.0+ to predict house values on the Boston Housing dataset. #### 3 - Neural Networks ##### Supervised @@ -33,12 +34,16 @@ #### 4 - Utilities - **Save and Restore a model** ([notebook](https://github.com/aymericdamien/TensorFlow-Examples/blob/master/tensorflow_v2/notebooks/4_Utils/save_restore_model.ipynb)). Save and Restore a model with TensorFlow 2.0. - **Build Custom Layers & Modules** ([notebook](https://github.com/aymericdamien/TensorFlow-Examples/blob/master/tensorflow_v2/notebooks/4_Utils/build_custom_layers.ipynb)). Learn how to build your own layers / modules and integrate them into TensorFlow 2.0 Models. +- **TensorBoard** ([notebook](https://github.com/aymericdamien/TensorFlow-Examples/blob/master/tensorflow_v2/notebooks/4_Utils/tensorboard.ipynb)). Track and visualize neural network computation graph, metrics, weights and more, using TensorFlow 2.0+ TensorBoard. #### 5 - Data Management - **Load and Parse data** ([notebook](https://github.com/aymericdamien/TensorFlow-Examples/blob/master/tensorflow_v2/notebooks/5_DataManagement/load_data.ipynb)). Build efficient data pipeline with TensorFlow 2.0 (Numpy arrays, Images, CSV files, custom data, ...). - **Build and Load TFRecords** ([notebook](https://github.com/aymericdamien/TensorFlow-Examples/blob/master/tensorflow_v2/notebooks/5_DataManagement/tfrecords.ipynb)). Convert data into TFRecords format, and load them with TensorFlow 2.0. - **Image Transformation (i.e. Image Augmentation)** ([notebook](https://github.com/aymericdamien/TensorFlow-Examples/blob/master/tensorflow_v2/notebooks/5_DataManagement/image_transformation.ipynb)). Apply various image augmentation techniques with TensorFlow 2.0, to generate distorted images for training. +#### 6 - Hardware +- **Multi-GPU Training** ([notebook](https://github.com/aymericdamien/TensorFlow-Examples/blob/master/tensorflow_v2/notebooks/6_Hardware/multigpu_training.ipynb)). Train a convolutional neural network with multiple GPUs on the CIFAR-10 dataset (a minimal sketch follows below).
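For the TF2 notebooks, the idiomatic route is a distribution strategy. A minimal sketch, assuming `tf.distribute.MirroredStrategy` with a toy Keras CNN (the notebook itself may replicate the model and average gradients by hand; this architecture is not the notebook's):

```python
import tensorflow as tf

# Mirror the variables across all visible GPUs; gradient all-reduce is automatic.
strategy = tf.distribute.MirroredStrategy()
print("Replicas in sync:", strategy.num_replicas_in_sync)

with strategy.scope():
    # Toy CNN for 32x32 RGB inputs (CIFAR-10 shape).
    model = tf.keras.Sequential([
        tf.keras.layers.Conv2D(32, 3, activation="relu", input_shape=(32, 32, 3)),
        tf.keras.layers.GlobalAveragePooling2D(),
        tf.keras.layers.Dense(10),
    ])
    model.compile(
        optimizer="adam",
        loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
        metrics=["accuracy"],
    )

(x_train, y_train), _ = tf.keras.datasets.cifar10.load_data()
# Scale the global batch size with the number of replicas.
model.fit(x_train / 255.0, y_train,
          batch_size=64 * strategy.num_replicas_in_sync, epochs=1)
```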
+ ## Installation To install TensorFlow 2.0, simply run: diff --git a/tensorflow_v2/notebooks/0_Prerequisite/ml_introduction.ipynb b/tensorflow_v2/notebooks/0_Prerequisite/ml_introduction.ipynb index 226dd66f..0ddf5419 100644 --- a/tensorflow_v2/notebooks/0_Prerequisite/ml_introduction.ipynb +++ b/tensorflow_v2/notebooks/0_Prerequisite/ml_introduction.ipynb @@ -13,7 +13,7 @@ "## Machine Learning\n", "\n", "- [An Introduction to Machine Learning Theory and Its Applications: A Visual Tutorial with Examples](https://www.toptal.com/machine-learning/machine-learning-theory-an-introductory-primer)\n", - "- [A Gentle Guide to Machine Learning](https://blog.monkeylearn.com/a-gentle-guide-to-machine-learning/)\n", + "- [A Gentle Guide to Machine Learning](https://monkeylearn.com/blog/gentle-guide-to-machine-learning/)\n", "- [A Visual Introduction to Machine Learning](http://www.r2d3.us/visual-intro-to-machine-learning-part-1/)\n", "- [Introduction to Machine Learning](http://alex.smola.org/drafts/thebook.pdf)\n", "\n", diff --git a/tensorflow_v2/notebooks/0_Prerequisite/mnist_dataset_intro.ipynb b/tensorflow_v2/notebooks/0_Prerequisite/mnist_dataset_intro.ipynb index f1813c85..93c9e79e 100644 --- a/tensorflow_v2/notebooks/0_Prerequisite/mnist_dataset_intro.ipynb +++ b/tensorflow_v2/notebooks/0_Prerequisite/mnist_dataset_intro.ipynb @@ -1,10 +1,8 @@ { "cells": [ { - "cell_type": "code", - "execution_count": null, + "cell_type": "markdown", "metadata": {}, - "outputs": [], "source": [ "\n", "# MNIST Dataset Introduction\n", @@ -29,9 +27,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": true - }, + "metadata": {}, "outputs": [], "source": [ "# Import MNIST\n", @@ -55,9 +51,7 @@ { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": true - }, + "metadata": {}, "outputs": [], "source": [ "# Get the next 64 images array and labels\n", @@ -88,9 +82,9 @@ "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython2", - "version": "2.7.13" + "version": "2.7.18" } }, "nbformat": 4, - "nbformat_minor": 0 + "nbformat_minor": 1 } diff --git a/tensorflow_v2/notebooks/2_BasicModels/gradient_boosted_trees.ipynb b/tensorflow_v2/notebooks/2_BasicModels/gradient_boosted_trees.ipynb new file mode 100644 index 00000000..cad1d9e8 --- /dev/null +++ b/tensorflow_v2/notebooks/2_BasicModels/gradient_boosted_trees.ipynb @@ -0,0 +1,604 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Gradient Boosted Decision Tree (GBDT)\n", + "Implement a Gradient Boosted Decision Tree (GBDT) with TensorFlow. This example uses the Boston Housing dataset as training data. The example supports both Classification (2 classes: value >= $23,000 or not) and Regression (raw home value as target).\n", + "\n", + "- Author: Aymeric Damien\n", + "- Project: https://github.com/aymericdamien/TensorFlow-Examples/" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Boston Housing Dataset\n", + "\n", + "**Link:** https://www.cs.toronto.edu/~delve/data/boston/bostonDetail.html\n", + "\n", + "**Description:**\n", + "\n", + "The dataset contains information collected by the U.S. Census Service concerning housing in the area of Boston, Mass. It was obtained from the StatLib archive (http://lib.stat.cmu.edu/datasets/boston), and has been used extensively throughout the literature to benchmark algorithms. However, these comparisons were primarily done outside of Delve and are thus somewhat suspect. The dataset is small in size with only 506 cases.\n", + "\n", + "The data was originally published by Harrison, D. and Rubinfeld, D.L., 'Hedonic Prices and the Demand for Clean Air', J. Environ. Economics & Management, vol. 5, 81-102, 1978.\n", + "\n", + "*For the full feature list, please see the link above*" + ] + }, + { + "cell_type": "code", + "execution_count": 1, + "metadata": {}, + "outputs": [], + "source": [ + "from __future__ import print_function\n", + "\n", + "# Ignore all GPUs (current TF GBDT does not support GPU).\n", + "import os\n", + "os.environ[\"CUDA_VISIBLE_DEVICES\"] = \"\"\n", + "os.environ['TF_CPP_MIN_LOG_LEVEL'] = \"1\"\n", + "\n", + "import tensorflow as tf\n", + "import numpy as np\n", + "import copy" + ] + }, + { + "cell_type": "code", + "execution_count": 2, + "metadata": {}, + "outputs": [], + "source": [ + "# Dataset parameters.\n", + "num_classes = 2 # Total classes: greater or equal to $23,000, or not (see notes below).\n", + "num_features = 13 # Number of data features.\n", + "\n", + "# Training parameters.\n", + "max_steps = 2000\n", + "batch_size = 256\n", + "learning_rate = 1.0\n", + "l1_regul = 0.0\n", + "l2_regul = 0.1\n", + "\n", + "# GBDT parameters.\n", + "num_batches_per_layer = 1000\n", + "num_trees = 10\n", + "max_depth = 4" + ] + }, + { + "cell_type": "code", + "execution_count": 3, + "metadata": {}, + "outputs": [], + "source": [ + "# Prepare Boston Housing Dataset.\n", + "from tensorflow.keras.datasets import boston_housing\n", + "(x_train, y_train), (x_test, y_test) = boston_housing.load_data()\n", + "\n", + "# For classification purposes, we build 2 classes: price greater than or equal to $23,000, or not.\n", + "def to_binary_class(y):\n", + " for i, label in enumerate(y):\n", + " if label >= 23.0:\n", + " y[i] = 1\n", + " else:\n", + " y[i] = 0\n", + " return y\n", + "\n", + "y_train_binary = to_binary_class(copy.deepcopy(y_train))\n", + "y_test_binary = to_binary_class(copy.deepcopy(y_test))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### GBDT Classifier" + ] + }, + { + "cell_type": "code", + "execution_count": 4, + "metadata": {}, + "outputs": [], + "source": [ + "# Build the input function.\n", + "train_input_fn = tf.compat.v1.estimator.inputs.numpy_input_fn(\n", + " x={'x': x_train}, y=y_train_binary,\n", + " batch_size=batch_size, num_epochs=None, shuffle=True)\n", + "test_input_fn = tf.compat.v1.estimator.inputs.numpy_input_fn(\n", + " x={'x': x_test}, y=y_test_binary,\n", + " batch_size=batch_size, num_epochs=1, shuffle=False)\n", + "test_train_input_fn = tf.compat.v1.estimator.inputs.numpy_input_fn(\n", + " x={'x': x_train}, y=y_train_binary,\n", + " batch_size=batch_size, num_epochs=1, shuffle=False)\n", + "# GBDT models from TF Estimator require the 'feature_column' data format.\n", + "feature_columns = [tf.feature_column.numeric_column(key='x', shape=(num_features,))]" + ] + }, + { + "cell_type": "code", + "execution_count": 5, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "INFO:tensorflow:Using default config.\n", + "WARNING:tensorflow:Using temporary folder as model directory: /tmp/tmp5h6BoR\n", + "INFO:tensorflow:Using config: {'_save_checkpoints_secs': 600, '_num_ps_replicas': 0, '_keep_checkpoint_max': 5, '_task_type': 'worker', '_global_id_in_cluster': 0, '_is_chief': True, '_cluster_spec': ClusterSpec({}), '_model_dir': '/tmp/tmp5h6BoR', '_protocol': None, '_save_checkpoints_steps': None, '_keep_checkpoint_every_n_hours': 10000, '_service': None, 
'_session_config': allow_soft_placement: true\n", + "graph_options {\n", + " rewrite_options {\n", + " meta_optimizer_iterations: ONE\n", + " }\n", + "}\n", + ", '_tf_random_seed': None, '_save_summary_steps': 100, '_device_fn': None, '_session_creation_timeout_secs': 7200, '_experimental_distribute': None, '_num_worker_replicas': 1, '_task_id': 0, '_log_step_count_steps': 100, '_experimental_max_worker_delay_secs': None, '_evaluation_master': '', '_eval_distribute': None, '_train_distribute': None, '_master': ''}\n" + ] + } + ], + "source": [ + "gbdt_classifier = tf.estimator.BoostedTreesClassifier(\n", + " n_batches_per_layer=num_batches_per_layer,\n", + " feature_columns=feature_columns, \n", + " n_classes=num_classes,\n", + " learning_rate=learning_rate, \n", + " n_trees=num_trees,\n", + " max_depth=max_depth,\n", + " l1_regularization=l1_regul, \n", + " l2_regularization=l2_regul\n", + ")" + ] + }, + { + "cell_type": "code", + "execution_count": 6, + "metadata": { + "scrolled": false + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "WARNING:tensorflow:From /home/user/anaconda3/envs/py2/lib/python2.7/site-packages/tensorflow_core/python/ops/resource_variable_ops.py:1635: calling __init__ (from tensorflow.python.ops.resource_variable_ops) with constraint is deprecated and will be removed in a future version.\n", + "Instructions for updating:\n", + "If using Keras pass *_constraint arguments to layers.\n", + "WARNING:tensorflow:From /home/user/anaconda3/envs/py2/lib/python2.7/site-packages/tensorflow_core/python/training/training_util.py:236: initialized_value (from tensorflow.python.ops.variables) is deprecated and will be removed in a future version.\n", + "Instructions for updating:\n", + "Use Variable.read_value. Variables in 2.X are initialized automatically both in eager and graph (inside tf.defun) contexts.\n", + "WARNING:tensorflow:From /home/user/anaconda3/envs/py2/lib/python2.7/site-packages/tensorflow_estimator/python/estimator/inputs/queues/feeding_queue_runner.py:62: __init__ (from tensorflow.python.training.queue_runner_impl) is deprecated and will be removed in a future version.\n", + "Instructions for updating:\n", + "To construct input pipelines, use the `tf.data` module.\n", + "WARNING:tensorflow:From /home/user/anaconda3/envs/py2/lib/python2.7/site-packages/tensorflow_estimator/python/estimator/inputs/queues/feeding_functions.py:500: add_queue_runner (from tensorflow.python.training.queue_runner_impl) is deprecated and will be removed in a future version.\n", + "Instructions for updating:\n", + "To construct input pipelines, use the `tf.data` module.\n", + "INFO:tensorflow:Calling model_fn.\n", + "INFO:tensorflow:Done calling model_fn.\n", + "INFO:tensorflow:Create CheckpointSaverHook.\n", + "WARNING:tensorflow:Issue encountered when serializing resources.\n", + "Type is unsupported, or the types of the items don't match field type in CollectionDef. 
Note this is a warning and probably safe to ignore.\n", + "'_Resource' object has no attribute 'name'\n", + "INFO:tensorflow:Graph was finalized.\n", + "INFO:tensorflow:Running local_init_op.\n", + "INFO:tensorflow:Done running local_init_op.\n", + "WARNING:tensorflow:From /home/user/anaconda3/envs/py2/lib/python2.7/site-packages/tensorflow_core/python/training/monitored_session.py:906: start_queue_runners (from tensorflow.python.training.queue_runner_impl) is deprecated and will be removed in a future version.\n", + "Instructions for updating:\n", + "To construct input pipelines, use the `tf.data` module.\n", + "WARNING:tensorflow:Issue encountered when serializing resources.\n", + "Type is unsupported, or the types of the items don't match field type in CollectionDef. Note this is a warning and probably safe to ignore.\n", + "'_Resource' object has no attribute 'name'\n", + "INFO:tensorflow:Saving checkpoints for 0 into /tmp/tmp5h6BoR/model.ckpt.\n", + "WARNING:tensorflow:Issue encountered when serializing resources.\n", + "Type is unsupported, or the types of the items don't match field type in CollectionDef. Note this is a warning and probably safe to ignore.\n", + "'_Resource' object has no attribute 'name'\n", + "INFO:tensorflow:loss = 0.6931475, step = 0\n", + "WARNING:tensorflow:It seems that global step (tf.train.get_global_step) has not been increased. Current value (could be stable): 0 vs previous value: 0. You could increase the global step by passing tf.train.get_global_step() to Optimizer.apply_gradients or Optimizer.minimize.\n", + "WARNING:tensorflow:It seems that global step (tf.train.get_global_step) has not been increased. Current value (could be stable): 0 vs previous value: 0. You could increase the global step by passing tf.train.get_global_step() to Optimizer.apply_gradients or Optimizer.minimize.\n", + "WARNING:tensorflow:It seems that global step (tf.train.get_global_step) has not been increased. Current value (could be stable): 0 vs previous value: 0. You could increase the global step by passing tf.train.get_global_step() to Optimizer.apply_gradients or Optimizer.minimize.\n", + "WARNING:tensorflow:It seems that global step (tf.train.get_global_step) has not been increased. Current value (could be stable): 0 vs previous value: 0. You could increase the global step by passing tf.train.get_global_step() to Optimizer.apply_gradients or Optimizer.minimize.\n", + "WARNING:tensorflow:It seems that global step (tf.train.get_global_step) has not been increased. Current value (could be stable): 0 vs previous value: 0. 
You could increase the global step by passing tf.train.get_global_step() to Optimizer.apply_gradients or Optimizer.minimize.\n", + "INFO:tensorflow:loss = 0.6931475, step = 0 (0.406 sec)\n", + "INFO:tensorflow:loss = 0.6931475, step = 0 (0.156 sec)\n", + "INFO:tensorflow:loss = 0.6931475, step = 0 (0.167 sec)\n", + "INFO:tensorflow:loss = 0.6931475, step = 0 (0.156 sec)\n", + "INFO:tensorflow:loss = 0.6931475, step = 0 (0.161 sec)\n", + "INFO:tensorflow:loss = 0.6931475, step = 0 (0.156 sec)\n", + "INFO:tensorflow:loss = 0.6931475, step = 0 (0.154 sec)\n", + "INFO:tensorflow:loss = 0.6931475, step = 0 (0.155 sec)\n", + "INFO:tensorflow:loss = 0.6931475, step = 0 (0.158 sec)\n", + "INFO:tensorflow:loss = 0.6931475, step = 0 (0.150 sec)\n", + "INFO:tensorflow:global_step/sec: 47.2392\n", + "INFO:tensorflow:loss = 0.6931475, step = 100 (0.301 sec)\n", + "INFO:tensorflow:global_step/sec: 605.484\n", + "INFO:tensorflow:loss = 0.6931475, step = 200 (0.165 sec)\n", + "INFO:tensorflow:global_step/sec: 616.234\n", + "INFO:tensorflow:loss = 0.6931475, step = 300 (0.162 sec)\n", + "INFO:tensorflow:global_step/sec: 607.741\n", + "INFO:tensorflow:loss = 0.6931475, step = 400 (0.165 sec)\n", + "INFO:tensorflow:global_step/sec: 591.803\n", + "INFO:tensorflow:loss = 0.6931475, step = 500 (0.170 sec)\n", + "INFO:tensorflow:global_step/sec: 627.369\n", + "INFO:tensorflow:loss = 0.6931475, step = 600 (0.159 sec)\n", + "INFO:tensorflow:global_step/sec: 617.083\n", + "INFO:tensorflow:loss = 0.6931475, step = 700 (0.162 sec)\n", + "INFO:tensorflow:global_step/sec: 608.765\n", + "INFO:tensorflow:loss = 0.6931475, step = 800 (0.164 sec)\n", + "INFO:tensorflow:global_step/sec: 619.62\n", + "INFO:tensorflow:loss = 0.6931475, step = 900 (0.161 sec)\n", + "INFO:tensorflow:global_step/sec: 582.581\n", + "INFO:tensorflow:loss = 0.44474202, step = 1000 (0.172 sec)\n", + "INFO:tensorflow:global_step/sec: 587.127\n", + "INFO:tensorflow:loss = 0.46633375, step = 1100 (0.170 sec)\n", + "INFO:tensorflow:global_step/sec: 583.294\n", + "INFO:tensorflow:loss = 0.45393157, step = 1200 (0.171 sec)\n", + "INFO:tensorflow:global_step/sec: 590.375\n", + "INFO:tensorflow:loss = 0.44438446, step = 1300 (0.170 sec)\n", + "INFO:tensorflow:global_step/sec: 572.479\n", + "INFO:tensorflow:loss = 0.4523462, step = 1400 (0.175 sec)\n", + "INFO:tensorflow:global_step/sec: 580.282\n", + "INFO:tensorflow:loss = 0.4581305, step = 1500 (0.172 sec)\n", + "INFO:tensorflow:global_step/sec: 570.032\n", + "INFO:tensorflow:loss = 0.45298833, step = 1600 (0.175 sec)\n", + "INFO:tensorflow:global_step/sec: 615.6\n", + "INFO:tensorflow:loss = 0.4474975, step = 1700 (0.162 sec)\n", + "INFO:tensorflow:global_step/sec: 603.042\n", + "INFO:tensorflow:loss = 0.47046587, step = 1800 (0.166 sec)\n", + "INFO:tensorflow:global_step/sec: 598.262\n", + "INFO:tensorflow:loss = 0.46371317, step = 1900 (0.167 sec)\n", + "INFO:tensorflow:global_step/sec: 591.323\n", + "INFO:tensorflow:Saving checkpoints for 2000 into /tmp/tmp5h6BoR/model.ckpt.\n", + "WARNING:tensorflow:Issue encountered when serializing resources.\n", + "Type is unsupported, or the types of the items don't match field type in CollectionDef. 
Note this is a warning and probably safe to ignore.\n", + "'_Resource' object has no attribute 'name'\n", + "INFO:tensorflow:Loss for final step: 0.46488184.\n" + ] + }, + { + "data": { + "text/plain": [ + "" + ] + }, + "execution_count": 6, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "gbdt_classifier.train(train_input_fn, max_steps=max_steps)" + ] + }, + { + "cell_type": "code", + "execution_count": 7, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "INFO:tensorflow:Calling model_fn.\n", + "WARNING:tensorflow:From /home/user/anaconda3/envs/py2/lib/python2.7/site-packages/tensorflow_core/python/ops/metrics_impl.py:2029: div (from tensorflow.python.ops.math_ops) is deprecated and will be removed in a future version.\n", + "Instructions for updating:\n", + "Deprecated in favor of operator or tf.math.divide.\n", + "WARNING:tensorflow:From /home/user/anaconda3/envs/py2/lib/python2.7/site-packages/tensorflow_estimator/python/estimator/canned/head.py:619: auc (from tensorflow.python.ops.metrics_impl) is deprecated and will be removed in a future version.\n", + "Instructions for updating:\n", + "The value of AUC returned by this may race with the update so this is deprected. Please use tf.keras.metrics.AUC instead.\n", + "WARNING:tensorflow:Trapezoidal rule is known to produce incorrect PR-AUCs; please switch to \"careful_interpolation\" instead.\n", + "WARNING:tensorflow:Trapezoidal rule is known to produce incorrect PR-AUCs; please switch to \"careful_interpolation\" instead.\n", + "INFO:tensorflow:Done calling model_fn.\n", + "INFO:tensorflow:Starting evaluation at 2020-07-15T00:50:36Z\n", + "INFO:tensorflow:Graph was finalized.\n", + "INFO:tensorflow:Restoring parameters from /tmp/tmp5h6BoR/model.ckpt-2000\n", + "INFO:tensorflow:Running local_init_op.\n", + "INFO:tensorflow:Done running local_init_op.\n", + "INFO:tensorflow:Inference Time : 0.56490s\n", + "INFO:tensorflow:Finished evaluation at 2020-07-15-00:50:37\n", + "INFO:tensorflow:Saving dict for global step 2000: accuracy = 0.87376237, accuracy_baseline = 0.63118815, auc = 0.92280567, auc_precision_recall = 0.9104949, average_loss = 0.38236493, global_step = 2000, label/mean = 0.36881188, loss = 0.38619137, precision = 0.8888889, prediction/mean = 0.378958, recall = 0.7516779\n", + "WARNING:tensorflow:Issue encountered when serializing resources.\n", + "Type is unsupported, or the types of the items don't match field type in CollectionDef. 
Note this is a warning and probably safe to ignore.\n", + "'_Resource' object has no attribute 'name'\n", + "INFO:tensorflow:Saving 'checkpoint_path' summary for global step 2000: /tmp/tmp5h6BoR/model.ckpt-2000\n" + ] + }, + { + "data": { + "text/plain": [ + "{'accuracy': 0.87376237,\n", + " 'accuracy_baseline': 0.63118815,\n", + " 'auc': 0.92280567,\n", + " 'auc_precision_recall': 0.9104949,\n", + " 'average_loss': 0.38236493,\n", + " 'global_step': 2000,\n", + " 'label/mean': 0.36881188,\n", + " 'loss': 0.38619137,\n", + " 'precision': 0.8888889,\n", + " 'prediction/mean': 0.378958,\n", + " 'recall': 0.7516779}" + ] + }, + "execution_count": 7, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "gbdt_classifier.evaluate(test_train_input_fn)" + ] + }, + { + "cell_type": "code", + "execution_count": 8, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "INFO:tensorflow:Calling model_fn.\n", + "WARNING:tensorflow:Trapezoidal rule is known to produce incorrect PR-AUCs; please switch to \"careful_interpolation\" instead.\n", + "WARNING:tensorflow:Trapezoidal rule is known to produce incorrect PR-AUCs; please switch to \"careful_interpolation\" instead.\n", + "INFO:tensorflow:Done calling model_fn.\n", + "INFO:tensorflow:Starting evaluation at 2020-07-15T00:50:38Z\n", + "INFO:tensorflow:Graph was finalized.\n", + "INFO:tensorflow:Restoring parameters from /tmp/tmp5h6BoR/model.ckpt-2000\n", + "INFO:tensorflow:Running local_init_op.\n", + "INFO:tensorflow:Done running local_init_op.\n", + "INFO:tensorflow:Inference Time : 0.56883s\n", + "INFO:tensorflow:Finished evaluation at 2020-07-15-00:50:38\n", + "INFO:tensorflow:Saving dict for global step 2000: accuracy = 0.78431374, accuracy_baseline = 0.5588235, auc = 0.8458089, auc_precision_recall = 0.86285317, average_loss = 0.49404, global_step = 2000, label/mean = 0.44117647, loss = 0.49404, precision = 0.87096775, prediction/mean = 0.37467176, recall = 0.6\n", + "INFO:tensorflow:Saving 'checkpoint_path' summary for global step 2000: /tmp/tmp5h6BoR/model.ckpt-2000\n" + ] + }, + { + "data": { + "text/plain": [ + "{'accuracy': 0.78431374,\n", + " 'accuracy_baseline': 0.5588235,\n", + " 'auc': 0.8458089,\n", + " 'auc_precision_recall': 0.86285317,\n", + " 'average_loss': 0.49404,\n", + " 'global_step': 2000,\n", + " 'label/mean': 0.44117647,\n", + " 'loss': 0.49404,\n", + " 'precision': 0.87096775,\n", + " 'prediction/mean': 0.37467176,\n", + " 'recall': 0.6}" + ] + }, + "execution_count": 8, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "gbdt_classifier.evaluate(test_input_fn)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### GBDT Regressor" + ] + }, + { + "cell_type": "code", + "execution_count": 9, + "metadata": {}, + "outputs": [], + "source": [ + "# Build the input function.\n", + "train_input_fn = tf.compat.v1.estimator.inputs.numpy_input_fn(\n", + " x={'x': x_train}, y=y_train,\n", + " batch_size=batch_size, num_epochs=None, shuffle=True)\n", + "test_input_fn = tf.compat.v1.estimator.inputs.numpy_input_fn(\n", + " x={'x': x_test}, y=y_test,\n", + " batch_size=batch_size, num_epochs=1, shuffle=False)\n", + "# GBDT Models from TF Estimator requires 'feature_column' data format.\n", + "feature_columns = [tf.feature_column.numeric_column(key='x', shape=(num_features,))]" + ] + }, + { + "cell_type": "code", + "execution_count": 10, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + 
"text": [ + "INFO:tensorflow:Using default config.\n", + "WARNING:tensorflow:Using temporary folder as model directory: /tmp/tmpts3Kmu\n", + "INFO:tensorflow:Using config: {'_save_checkpoints_secs': 600, '_num_ps_replicas': 0, '_keep_checkpoint_max': 5, '_task_type': 'worker', '_global_id_in_cluster': 0, '_is_chief': True, '_cluster_spec': ClusterSpec({}), '_model_dir': '/tmp/tmpts3Kmu', '_protocol': None, '_save_checkpoints_steps': None, '_keep_checkpoint_every_n_hours': 10000, '_service': None, '_session_config': allow_soft_placement: true\n", + "graph_options {\n", + " rewrite_options {\n", + " meta_optimizer_iterations: ONE\n", + " }\n", + "}\n", + ", '_tf_random_seed': None, '_save_summary_steps': 100, '_device_fn': None, '_session_creation_timeout_secs': 7200, '_experimental_distribute': None, '_num_worker_replicas': 1, '_task_id': 0, '_log_step_count_steps': 100, '_experimental_max_worker_delay_secs': None, '_evaluation_master': '', '_eval_distribute': None, '_train_distribute': None, '_master': ''}\n" + ] + } + ], + "source": [ + "gbdt_regressor = tf.estimator.BoostedTreesRegressor(\n", + " n_batches_per_layer=num_batches_per_layer,\n", + " feature_columns=feature_columns, \n", + " learning_rate=learning_rate, \n", + " n_trees=num_trees,\n", + " max_depth=max_depth,\n", + " l1_regularization=l1_regul, \n", + " l2_regularization=l2_regul\n", + ")" + ] + }, + { + "cell_type": "code", + "execution_count": 11, + "metadata": { + "scrolled": false + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "INFO:tensorflow:Calling model_fn.\n", + "INFO:tensorflow:Done calling model_fn.\n", + "INFO:tensorflow:Create CheckpointSaverHook.\n", + "WARNING:tensorflow:Issue encountered when serializing resources.\n", + "Type is unsupported, or the types of the items don't match field type in CollectionDef. Note this is a warning and probably safe to ignore.\n", + "'_Resource' object has no attribute 'name'\n", + "INFO:tensorflow:Graph was finalized.\n", + "INFO:tensorflow:Running local_init_op.\n", + "INFO:tensorflow:Done running local_init_op.\n", + "WARNING:tensorflow:Issue encountered when serializing resources.\n", + "Type is unsupported, or the types of the items don't match field type in CollectionDef. Note this is a warning and probably safe to ignore.\n", + "'_Resource' object has no attribute 'name'\n", + "INFO:tensorflow:Saving checkpoints for 0 into /tmp/tmpts3Kmu/model.ckpt.\n", + "WARNING:tensorflow:Issue encountered when serializing resources.\n", + "Type is unsupported, or the types of the items don't match field type in CollectionDef. 
Note this is a warning and probably safe to ignore.\n", + "'_Resource' object has no attribute 'name'\n", + "INFO:tensorflow:loss = 584.82294, step = 0\n", + "INFO:tensorflow:loss = 560.2794, step = 0 (0.369 sec)\n", + "INFO:tensorflow:loss = 606.68115, step = 0 (0.156 sec)\n", + "INFO:tensorflow:loss = 583.2771, step = 0 (0.155 sec)\n", + "INFO:tensorflow:loss = 603.4647, step = 0 (0.160 sec)\n", + "INFO:tensorflow:loss = 605.8213, step = 0 (0.153 sec)\n", + "INFO:tensorflow:loss = 577.5599, step = 0 (0.157 sec)\n", + "INFO:tensorflow:loss = 585.297, step = 0 (0.157 sec)\n", + "INFO:tensorflow:loss = 545.26074, step = 0 (0.156 sec)\n", + "INFO:tensorflow:loss = 597.91046, step = 0 (0.190 sec)\n", + "INFO:tensorflow:loss = 600.55396, step = 0 (0.174 sec)\n", + "INFO:tensorflow:global_step/sec: 47.5449\n", + "INFO:tensorflow:loss = 539.62646, step = 100 (0.280 sec)\n", + "INFO:tensorflow:global_step/sec: 592.267\n", + "INFO:tensorflow:loss = 573.9592, step = 200 (0.169 sec)\n", + "INFO:tensorflow:global_step/sec: 573.943\n", + "INFO:tensorflow:loss = 617.79407, step = 300 (0.175 sec)\n", + "INFO:tensorflow:global_step/sec: 583.88\n", + "INFO:tensorflow:loss = 593.62915, step = 400 (0.171 sec)\n", + "INFO:tensorflow:global_step/sec: 595.888\n", + "INFO:tensorflow:loss = 594.5435, step = 500 (0.168 sec)\n", + "INFO:tensorflow:global_step/sec: 610.997\n", + "INFO:tensorflow:loss = 579.5427, step = 600 (0.163 sec)\n", + "INFO:tensorflow:global_step/sec: 625.07\n", + "INFO:tensorflow:loss = 555.19604, step = 700 (0.160 sec)\n", + "INFO:tensorflow:global_step/sec: 674.427\n", + "INFO:tensorflow:loss = 585.61127, step = 800 (0.149 sec)\n", + "INFO:tensorflow:global_step/sec: 652.597\n", + "INFO:tensorflow:loss = 645.147, step = 900 (0.153 sec)\n", + "INFO:tensorflow:global_step/sec: 656.608\n", + "INFO:tensorflow:loss = 65.438034, step = 1000 (0.152 sec)\n", + "INFO:tensorflow:global_step/sec: 660.171\n", + "INFO:tensorflow:loss = 57.25811, step = 1100 (0.151 sec)\n", + "INFO:tensorflow:global_step/sec: 676.676\n", + "INFO:tensorflow:loss = 70.39737, step = 1200 (0.148 sec)\n", + "INFO:tensorflow:global_step/sec: 664.916\n", + "INFO:tensorflow:loss = 63.969463, step = 1300 (0.150 sec)\n", + "INFO:tensorflow:global_step/sec: 679.204\n", + "INFO:tensorflow:loss = 55.910896, step = 1400 (0.147 sec)\n", + "INFO:tensorflow:global_step/sec: 680.936\n", + "INFO:tensorflow:loss = 58.16027, step = 1500 (0.147 sec)\n", + "INFO:tensorflow:global_step/sec: 670.412\n", + "INFO:tensorflow:loss = 66.20054, step = 1600 (0.149 sec)\n", + "INFO:tensorflow:global_step/sec: 673.441\n", + "INFO:tensorflow:loss = 52.643417, step = 1700 (0.149 sec)\n", + "INFO:tensorflow:global_step/sec: 684.782\n", + "INFO:tensorflow:loss = 59.981026, step = 1800 (0.145 sec)\n", + "INFO:tensorflow:global_step/sec: 684.191\n", + "INFO:tensorflow:loss = 65.427055, step = 1900 (0.146 sec)\n", + "INFO:tensorflow:global_step/sec: 683.812\n", + "INFO:tensorflow:Saving checkpoints for 2000 into /tmp/tmpts3Kmu/model.ckpt.\n", + "WARNING:tensorflow:Issue encountered when serializing resources.\n", + "Type is unsupported, or the types of the items don't match field type in CollectionDef. 
Note this is a warning and probably safe to ignore.\n", + "'_Resource' object has no attribute 'name'\n", + "INFO:tensorflow:Loss for final step: 42.740192.\n" + ] + }, + { + "data": { + "text/plain": [ + "" + ] + }, + "execution_count": 11, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "gbdt_regressor.train(train_input_fn, max_steps=max_steps)" + ] + }, + { + "cell_type": "code", + "execution_count": 12, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "INFO:tensorflow:Calling model_fn.\n", + "INFO:tensorflow:Done calling model_fn.\n", + "INFO:tensorflow:Starting evaluation at 2020-07-15T00:50:45Z\n", + "INFO:tensorflow:Graph was finalized.\n", + "INFO:tensorflow:Restoring parameters from /tmp/tmpts3Kmu/model.ckpt-2000\n", + "INFO:tensorflow:Running local_init_op.\n", + "INFO:tensorflow:Done running local_init_op.\n", + "INFO:tensorflow:Inference Time : 0.24467s\n", + "INFO:tensorflow:Finished evaluation at 2020-07-15-00:50:45\n", + "INFO:tensorflow:Saving dict for global step 2000: average_loss = 30.202602, global_step = 2000, label/mean = 23.078432, loss = 30.202602, prediction/mean = 22.536291\n", + "WARNING:tensorflow:Issue encountered when serializing resources.\n", + "Type is unsupported, or the types of the items don't match field type in CollectionDef. Note this is a warning and probably safe to ignore.\n", + "'_Resource' object has no attribute 'name'\n", + "INFO:tensorflow:Saving 'checkpoint_path' summary for global step 2000: /tmp/tmpts3Kmu/model.ckpt-2000\n" + ] + }, + { + "data": { + "text/plain": [ + "{'average_loss': 30.202602,\n", + " 'global_step': 2000,\n", + " 'label/mean': 23.078432,\n", + " 'loss': 30.202602,\n", + " 'prediction/mean': 22.536291}" + ] + }, + "execution_count": 12, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "gbdt_regressor.evaluate(test_input_fn)" + ] + } + ], + "metadata": { + "kernelspec": { + "display_name": "Python 2", + "language": "python", + "name": "python2" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 2 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython2", + "version": "2.7.18" + } + }, + "nbformat": 4, + "nbformat_minor": 2 +} diff --git a/tensorflow_v2/notebooks/3_NeuralNetworks/neural_network_raw.ipynb b/tensorflow_v2/notebooks/3_NeuralNetworks/neural_network_raw.ipynb index bbec2f13..2e1032ec 100644 --- a/tensorflow_v2/notebooks/3_NeuralNetworks/neural_network_raw.ipynb +++ b/tensorflow_v2/notebooks/3_NeuralNetworks/neural_network_raw.ipynb @@ -180,7 +180,7 @@ " loss = cross_entropy(pred, y)\n", " \n", " # Variables to update, i.e. 
trainable variables.\n", - " trainable_variables = weights.values() + biases.values()\n", + " trainable_variables = list(weights.values()) + list(biases.values())\n", "\n", " # Compute gradients.\n", " gradients = g.gradient(loss, trainable_variables)\n", @@ -198,36 +198,36 @@ "name": "stdout", "output_type": "stream", "text": [ - "step: 100, loss: 567.292969, accuracy: 0.136719\n", - "step: 200, loss: 398.614929, accuracy: 0.562500\n", - "step: 300, loss: 226.743774, accuracy: 0.753906\n", - "step: 400, loss: 193.384521, accuracy: 0.777344\n", - "step: 500, loss: 138.649963, accuracy: 0.886719\n", - "step: 600, loss: 109.713669, accuracy: 0.898438\n", - "step: 700, loss: 90.397217, accuracy: 0.906250\n", - "step: 800, loss: 104.545380, accuracy: 0.894531\n", - "step: 900, loss: 94.204697, accuracy: 0.890625\n", - "step: 1000, loss: 81.660645, accuracy: 0.906250\n", - "step: 1100, loss: 81.237137, accuracy: 0.902344\n", - "step: 1200, loss: 65.776703, accuracy: 0.925781\n", - "step: 1300, loss: 94.195862, accuracy: 0.910156\n", - "step: 1400, loss: 79.425507, accuracy: 0.917969\n", - "step: 1500, loss: 93.508163, accuracy: 0.914062\n", - "step: 1600, loss: 88.912506, accuracy: 0.917969\n", - "step: 1700, loss: 79.033607, accuracy: 0.929688\n", - "step: 1800, loss: 65.788315, accuracy: 0.898438\n", - "step: 1900, loss: 73.462387, accuracy: 0.937500\n", - "step: 2000, loss: 59.309540, accuracy: 0.917969\n", - "step: 2100, loss: 67.014008, accuracy: 0.917969\n", - "step: 2200, loss: 48.297115, accuracy: 0.949219\n", - "step: 2300, loss: 64.523148, accuracy: 0.910156\n", - "step: 2400, loss: 72.989517, accuracy: 0.925781\n", - "step: 2500, loss: 57.588585, accuracy: 0.929688\n", - "step: 2600, loss: 44.957100, accuracy: 0.960938\n", - "step: 2700, loss: 59.788242, accuracy: 0.937500\n", - "step: 2800, loss: 63.581337, accuracy: 0.937500\n", - "step: 2900, loss: 53.471252, accuracy: 0.941406\n", - "step: 3000, loss: 43.869728, accuracy: 0.949219\n" + "step: 100, loss: 571.445923, accuracy: 0.222656\n", + "step: 200, loss: 405.567535, accuracy: 0.488281\n", + "step: 300, loss: 252.089172, accuracy: 0.660156\n", + "step: 400, loss: 192.252136, accuracy: 0.792969\n", + "step: 500, loss: 129.173553, accuracy: 0.855469\n", + "step: 600, loss: 125.191071, accuracy: 0.859375\n", + "step: 700, loss: 103.346634, accuracy: 0.890625\n", + "step: 800, loss: 120.199402, accuracy: 0.871094\n", + "step: 900, loss: 95.674088, accuracy: 0.890625\n", + "step: 1000, loss: 113.775406, accuracy: 0.878906\n", + "step: 1100, loss: 68.457413, accuracy: 0.941406\n", + "step: 1200, loss: 80.773163, accuracy: 0.914062\n", + "step: 1300, loss: 85.862785, accuracy: 0.902344\n", + "step: 1400, loss: 63.480415, accuracy: 0.949219\n", + "step: 1500, loss: 77.139435, accuracy: 0.910156\n", + "step: 1600, loss: 88.129692, accuracy: 0.933594\n", + "step: 1700, loss: 92.199730, accuracy: 0.906250\n", + "step: 1800, loss: 90.150421, accuracy: 0.886719\n", + "step: 1900, loss: 48.567772, accuracy: 0.949219\n", + "step: 2000, loss: 54.002838, accuracy: 0.941406\n", + "step: 2100, loss: 58.536209, accuracy: 0.933594\n", + "step: 2200, loss: 47.156784, accuracy: 0.949219\n", + "step: 2300, loss: 55.344498, accuracy: 0.949219\n", + "step: 2400, loss: 70.956612, accuracy: 0.925781\n", + "step: 2500, loss: 76.179062, accuracy: 0.917969\n", + "step: 2600, loss: 44.956696, accuracy: 0.929688\n", + "step: 2700, loss: 56.581280, accuracy: 0.941406\n", + "step: 2800, loss: 57.775612, accuracy: 0.937500\n", + "step: 2900, loss: 46.005424, 
accuracy: 0.960938\n", + "step: 3000, loss: 51.832504, accuracy: 0.953125\n" ] } ], @@ -253,7 +253,7 @@ "name": "stdout", "output_type": "stream", "text": [ - "Test Accuracy: 0.936800\n" + "Test Accuracy: 0.937600\n" ] } ], @@ -280,12 +280,14 @@ "outputs": [ { "data": { - "image/png": "iVBORw0KGgoAAAANSUhEUgAAAP8AAAD8CAYAAAC4nHJkAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADl0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uIDIuMi4yLCBodHRwOi8vbWF0cGxvdGxpYi5vcmcvhp/UCwAADQNJREFUeJzt3W+MVfWdx/HPZylNjPQBWLHEgnQb3bgaAzoaE3AzamxYbYKN1NQHGzbZMH2AZps0ZA1PypMmjemfrU9IpikpJtSWhFbRGBeDGylRGwejBYpQICzMgkAzJgUT0yDfPphDO8W5v3u5/84dv+9XQube8z1/vrnhM+ecOefcnyNCAPL5h7obAFAPwg8kRfiBpAg/kBThB5Ii/EBShB9IivADSRF+IKnP9HNjtrmdEOixiHAr83W057e9wvZB24dtP9nJugD0l9u9t9/2LEmHJD0gaVzSW5Iei4jfF5Zhzw/0WD/2/HdJOhwRRyPiz5J+IWllB+sD0EedhP96SSemvB+vpv0d2yO2x2yPdbAtAF3WyR/8pju0+MRhfUSMShqVOOwHBkkne/5xSQunvP+ipJOdtQOgXzoJ/1uSbrT9JduflfQNSdu70xaAXmv7sD8iLth+XNL/SJolaVNE7O9aZwB6qu1LfW1tjHN+oOf6cpMPgJmL8ANJEX4gKcIPJEX4gaQIP5AU4QeSIvxAUoQfSIrwA0kRfiApwg8kRfiBpAg/kBThB5Ii/EBShB9IivADSRF+ICnCDyRF+IGkCD+QFOEHkiL8QFKEH0iK8ANJEX4gKcIPJEX4gaTaHqJbkmwfk3RO0seSLkTEUDeaAtB7HYW/cm9E/LEL6wHQRxz2A0l1Gv6QtMP2Htsj3WgIQH90eti/LCJO2p4v6RXb70XErqkzVL8U+MUADBhHRHdWZG+QdD4ivl+YpzsbA9BQRLiV+do+7Ld9te3PXXot6SuS9rW7PgD91clh/3WSfm370np+HhEvd6UrAD3XtcP+ljbGYT/Qcz0/7AcwsxF+ICnCDyRF+IGkCD+QFOEHkurGU30prFq1qmFtzZo1xWVPnjxZrH/00UfF+pYtW4r1999/v2Ht8OHDxWWRF3t+ICnCDyRF+IGkCD+QFOEHkiL8QFKEH0iKR3pbdPTo0Ya1xYsX96+RaZw7d65hbf/+/X3sZLCMj483rD311FPFZcfGxrrdTt/wSC+AIsIPJEX4gaQIP5AU4QeSIvxAUoQfSIrn+VtUemb/tttuKy574MCBYv3mm28u1m+//fZifXh4uGHt7rvvLi574sSJYn3hwoXFeicuXLhQrJ89e7ZYX7BgQdvbPn78eLE+k6/zt4o9P5AU4QeSIvxAUoQfSIrwA0kRfiApwg8k1fR5ftubJH1V0pmIuLWaNk/SLyUtlnRM0qMR8UHTjc3g5/kH2dy5cxvWlixZUlx2z549xfqdd97ZVk+taDZewaFDh4r1ZvdPzJs3r2Ft7dq1xWU3btxYrA+ybj7P/zNJKy6b9qSknRFxo6Sd1XsAM0jT8EfELkkTl01eKWlz9XqzpIe73BeAHmv3nP+6iDglSdXP+d1rCUA/9PzeftsjkkZ6vR0AV6bdPf9p2wskqfp5ptGMETEaEUMRMdTmtgD0QLvh3y5pdfV6taTnu9MOgH5pGn7bz0p6Q9I/2R63/R+SvifpAdt/kPRA9R7ADML39mNgPfLII8X61q1bi/V9+/Y1rN17773FZScmLr/ANXPwvf0Aigg/kBThB5Ii/EBShB9IivADSXGpD7WZP7/8SMjevXs7Wn7VqlUNa9u2bSsuO5NxqQ9AEeEHkiL8QFKEH0iK8ANJEX4gKcIPJMUQ3ahNs6/Pvvbaa4v1Dz4of1v8wYMHr7inTNjzA0kRfiApwg8kRfiBpAg/kBThB5Ii/EBSPM+Pnlq2bFnD2quvvlpcdvbs2cX68PBwsb5r165i/dOK5/kBFBF+ICnCDyRF+IGkCD+QFOEHkiL8QFJNn+e3vUnSVyWdiYhbq2kbJK2RdLaabX1EvNSrJjFzPfjggw1rza7j79y5s1h/44032uoJk1rZ8/9M0opppv8oIpZU/wg+MMM0DX9E7JI00YdeAPRRJ+f8j9v+ne1Ntud2rSMAfdFu+DdK+rKkJZJOSfpBoxltj9gesz3W5rYA9EBb4Y+I0xHxcURclPQTSXcV5h2NiKGIGGq3SQDd11b4bS+Y8vZrkvZ1px0A/dLKpb5nJQ1L+rztcUnfkTRse4mkkHRM0jd72COAHuB5fnTkqquuKtZ3797dsHbLLbcUl73vvvuK9ddff71Yz4rn+QEUEX4gKcIPJEX4gaQIP5AU4QeSYohudGTdunXF+tKlSxvWXn755eKyXMrrLfb8QFKEH0iK8ANJEX4gKcIPJEX4gaQIP5AUj/Si6KGHHirWn3vuuWL9ww8/bFhbsWK6L4X+mzfffLNYx/R4pBdAEeEHkiL8QFKEH0iK8ANJEX4gKcIPJMXz/Mldc801xfrTTz9drM+aNatYf+mlxgM4cx2/Xuz5gaQIP5AU4QeSIvxAUoQfSIrwA0kRfiCpps/z214o6RlJX5B0UdJoRPzY9jxJv5S0WNIxSY9GxAdN1sXz/H3W7Dp8s2vtd9xxR7F+5MiRYr30zH6zZdGebj7Pf0HStyPiZkl3S1pr+58lPSlpZ0TcKGln9R7ADNE0/BFxKiLerl6fk3RA0vWSVkraXM22WdLDvWoSQPdd0Tm/7cWSlkr6raTrIuKUNPkLQtL8bjcHoHdavrff9hxJ2yR9KyL+ZLd0WiHbI5JG2msPQK+0tOe3PVuTwd8SEb+qJp+2vaCqL5B0ZrplI2I0IoYiYqgbDQPojqbh9+Qu/qeSDkTED6eUtktaXb1eLen57rcHoFdaudS3XNJvJO3V5KU+SVqvyfP+rZIWSTou6esRMdFkXVzq67ObbrqpWH/vvfc6Wv/KlSuL9RdeeKGj9ePKtXqpr+k5f0TsltRoZfdfSVMABgd3+AFJEX4gKcIPJEX4gaQIP5AU4QeS4qu7PwVuuOGGhrUdO3Z0tO5169YV6y+++GJH60d92PMDSRF+ICnCDyRF+IGkCD+QFOEHkiL8QFJc5/8UGBlp/C1pixYt6mjdr732WrHe7PsgMLjY8wNJEX4gKcIPJEX4gaQIP5AU4QeSIvxAUlznnwGWL19erD/xxBN96gSfJuz5gaQIP5AU4QeSIvxAUoQfSIrwA0kRfiCpptf5bS+U9IykL0i6KGk0
<remainder of base64 PNG data omitted: figure rendered with matplotlib 2.2.2>\n",
+     "image/png": "<base64 PNG data omitted: same figure re-rendered with matplotlib 3.3.3>\n",
     "text/plain": [ "
" ] }, - "metadata": {}, + "metadata": { + "needs_background": "light" + }, "output_type": "display_data" }, { @@ -297,12 +299,14 @@ }, { "data": { - "image/png": "iVBORw0KGgoAAAANSUhEUgAAAP8AAAD8CAYAAAC4nHJkAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADl0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uIDIuMi4yLCBodHRwOi8vbWF0cGxvdGxpYi5vcmcvhp/UCwAADYNJREFUeJzt3X+oXPWZx/HPZ20CYouaFLMXYzc16rIqauUqiy2LSzW6S0wMWE3wjyy77O0fFbYYfxGECEuwLNvu7l+BFC9NtLVpuDHGWjYtsmoWTPAqGk2TtkauaTbX3A0pNkGkJnn2j3uy3MY7ZyYzZ+bMzfN+QZiZ88w552HI555z5pw5X0eEAOTzJ3U3AKAehB9IivADSRF+ICnCDyRF+IGkCD+QFOEHkiL8QFKf6+XKbHM5IdBlEeFW3tfRlt/2nbZ/Zfs92491siwAveV2r+23fZ6kX0u6XdJBSa9LWhERvyyZhy0/0GW92PLfLOm9iHg/Iv4g6ceSlnawPAA91En4L5X02ymvDxbT/ojtIdujtkc7WBeAinXyhd90uxaf2a2PiPWS1kvs9gP9pJMt/0FJl015PV/Soc7aAdArnYT/dUlX2v6y7dmSlkvaVk1bALqt7d3+iDhh+wFJ2yWdJ2k4IvZU1hmArmr7VF9bK+OYH+i6nlzkA2DmIvxAUoQfSIrwA0kRfiApwg8kRfiBpAg/kBThB5Ii/EBShB9IivADSRF+IKme3rob7XnooYdK6+eff37D2nXXXVc67z333NNWT6etW7eutP7aa681rD399NMdrRudYcsPJEX4gaQIP5AU4QeSIvxAUoQfSIrwA0lx994+sGnTptJ6p+fi67R///6Gtdtuu6103gMHDlTdTgrcvRdAKcIPJEX4gaQIP5AU4QeSIvxAUoQfSKqj3/PbHpN0TNJJSSciYrCKps41dZ7H37dvX2l9+/btpfXLL7+8tH7XXXeV1hcuXNiwdv/995fO++STT5bW0Zkqbubx1xFxpILlAOghdvuBpDoNf0j6ue03bA9V0RCA3uh0t/+rEXHI9iWSfmF7X0S8OvUNxR8F/jAAfaajLX9EHCoeJyQ9J+nmad6zPiIG+TIQ6C9th9/2Bba/cPq5pEWS3q2qMQDd1clu/zxJz9k+vZwfRcR/VtIVgK5rO/wR8b6k6yvsZcYaHCw/olm2bFlHy9+zZ09pfcmSJQ1rR46Un4U9fvx4aX327Nml9Z07d5bWr7++8X+RuXPnls6L7uJUH5AU4QeSIvxAUoQfSIrwA0kRfiAphuiuwMDAQGm9uBaioWan8u64447S+vj4eGm9E6tWrSqtX3311W0v+8UXX2x7XnSOLT+QFOEHkiL8QFKEH0iK8ANJEX4gKcIPJMV5/gq88MILpfUrrriitH7s2LHS+tGjR8+6p6osX768tD5r1qwedYKqseUHkiL8QFKEH0iK8ANJEX4gKcIPJEX4gaQ4z98DH3zwQd0tNPTwww+X1q+66qqOlr9r1662aug+tvxAUoQfSIrwA0kRfiApwg8kRfiBpAg/kJQjovwN9rCkxZImIuLaYtocSZskLZA0JuneiPhd05XZ5StD5RYvXlxa37x5c2m92RDdExMTpfWy+wG88sorpfOiPRFRPlBEoZUt/w8k3XnGtMckvRQRV0p6qXgNYAZpGv6IeFXSmbeSWSppQ/F8g6S7K+4LQJe1e8w/LyLGJal4vKS6lgD0Qtev7bc9JGmo2+sBcHba3fIftj0gScVjw299ImJ9RAxGxGCb6wLQBe2Gf5uklcXzlZKer6YdAL3SNPy2n5X0mqQ/t33Q9j9I+o6k223/RtLtxWsAM0jTY/6IWNGg9PWKe0EXDA6WH201O4/fzKZNm0rrnMvvX1zhByRF+IGkCD+QFOEHkiL8QFKEH0iKW3efA7Zu3dqwtmjRoo6WvXHjxtL6448/3tHyUR+2/EBShB9IivADSRF+ICnCDyRF+IGkCD+QVNNbd1e6Mm7d3ZaBgYHS+ttvv92wNnfu3NJ5jxw5Ulq/5ZZbSuv79+8vraP3qrx1N4BzEOEHkiL8QFKEH0iK8ANJEX4gKcIPJMXv+WeAkZGR0nqzc/llnnnmmdI65/HPXWz5gaQIP5AU4QeSIvxAUoQfSIrwA0kRfiCppuf5bQ9LWixpIiKuLaY9IekfJf1v8bbVEfGzbjV5rluyZElp/cYbb2x72S+//HJpfc2aNW0vGzNbK1v+H0i6c5rp/xYRNxT/CD4wwzQNf0S8KuloD3oB0EOdHPM/YHu37WHbF1fWEYCeaDf86yQtlHSDpHFJ3230RttDtkdtj7a5LgBd0Fb4I+JwRJyMiFOSvi/p5pL3ro+IwYgYbLdJANVrK/y2p95Odpmkd6tpB0CvtHKq71lJt0r6ou2DktZIutX2DZJC0pikb3axRwBd0DT8EbFimslPdaGXc1az39uvXr26tD5r1qy21/3WW2+V1o8fP972sjGzcYUfkBThB5Ii/EBShB9IivADSRF+IClu3d0Dq1atKq3fdNNNHS1/69atDWv8ZBeNsOUHkiL8QFKEH0iK8ANJEX4gKcIPJEX4gaQcEb1bmd27lfWRTz75pLTeyU92JWn+/PkNa+Pj4x0tGzNPRLiV97HlB5Ii/EBShB9IivADSRF+ICnCDyRF+IGk+D3/OWDOnDkNa59++mkPO/msjz76qGGtWW/Nrn+48MIL2+pJki666KLS+oMPPtj2sltx8uTJhrVHH320dN6PP/64kh7Y8gNJEX4gKcIPJEX4gaQIP5AU4QeSIvxAUk3P89u+TNJGSX8q6ZSk9RHxH7bnSNokaYGkMUn3RsTvutcqGtm9e3fdLTS0efPmhrVm9xqYN29eaf2+++5rq6d+9+GHH5bW165dW8l6Wtnyn5C0KiL+QtJfSvqW7aslPSbppYi4UtJLxWsAM0TT8EfEeES8WTw/JmmvpEslLZW0oXjbBkl3d6tJANU7q2N+2wskfUXSLknzImJcmvwDIemSqpsD0D0tX9tv+/OSRiR9OyJ+b7d0mzDZHpI01F57ALqlpS2/7VmaDP4PI2JLMfmw7YGiPiBpYrp5I2J9RAxGxGAVDQOoRtPwe3IT/5SkvRHxvSmlbZJWFs9XSnq++vYAdEvTW3fb/pqkHZLe0eSpPklarcnj/p9I+pKkA5K+ERFHmywr5a27t2zZUlpfunRpjzrJ5cSJEw1rp06dalhrxbZt20rro6OjbS97x44dpfWdO3eW1lu9dXfTY/6I+G9JjRb29VZWAqD/cIUfkBThB5Ii/EBShB9IivADSRF+ICmG6O4DjzzySGm90yG8y1xzzTWl9W7+bHZ4eLi0PjY21tHyR0ZGGtb27dvX0bL7GUN0Ayh
F+IGkCD+QFOEHkiL8QFKEH0iK8ANJcZ4fOMdwnh9AKcIPJEX4gaQIP5AU4QeSIvxAUoQfSIrwA0kRfiApwg8kRfiBpAg/kBThB5Ii/EBShB9Iqmn4bV9m+79s77W9x/Y/FdOfsP0/tt8q/v1t99sFUJWmN/OwPSBpICLetP0FSW9IulvSvZKOR8S/trwybuYBdF2rN/P4XAsLGpc0Xjw/ZnuvpEs7aw9A3c7qmN/2AklfkbSrmPSA7d22h21f3GCeIdujtkc76hRApVq+h5/tz0t6RdLaiNhie56kI5JC0j9r8tDg75ssg91+oMta3e1vKfy2Z0n6qaTtEfG9aeoLJP00Iq5tshzCD3RZZTfwtG1JT0naOzX4xReBpy2T9O7ZNgmgPq182/81STskvSPpVDF5taQVkm7Q5G7/mKRvFl8Oli2LLT/QZZXu9leF8APdx337AZQi/EBShB9IivADSRF+ICnCDyRF+IGkCD+QFOEHkiL8QFKEH0iK8ANJEX4gKcIPJNX0Bp4VOyLpgymvv1hM60f92lu/9iXRW7uq7O3PWn1jT3/P/5mV26MRMVhbAyX6tbd+7Uuit3bV1Ru7/UBShB9Iqu7wr695/WX6tbd+7Uuit3bV0lutx/wA6lP3lh9ATWoJv+07bf/K9nu2H6ujh0Zsj9l+pxh5uNYhxoph0CZsvztl2hzbv7D9m+Jx2mHSauqtL0ZuLhlZutbPrt9GvO75br/t8yT9WtLtkg5Kel3Sioj4ZU8bacD2mKTBiKj9nLDtv5J0XNLG06Mh2f4XSUcj4jvFH86LI+LRPuntCZ3lyM1d6q3RyNJ/pxo/uypHvK5CHVv+myW9FxHvR8QfJP1Y0tIa+uh7EfGqpKNnTF4qaUPxfIMm//P0XIPe+kJEjEfEm8XzY5JOjyxd62dX0lct6gj/pZJ+O+X1QfXXkN8h6ee237A9VHcz05h3emSk4vGSmvs5U9ORm3vpjJGl++aza2fE66rVEf7pRhPpp1MOX42IGyX9jaRvFbu3aM06SQs1OYzbuKTv1tlMMbL0iKRvR8Tv6+xlqmn6quVzqyP8ByVdNuX1fEmHauhjWhFxqHickPScJg9T+snh04OkFo8TNffz/yLicEScjIhTkr6vGj+7YmTpEUk/jIgtxeTaP7vp+qrrc6sj/K9LutL2l23PlrRc0rYa+vgM2xcUX8TI9gWSFqn/Rh/eJmll8XylpOdr7OWP9MvIzY1GllbNn12/jXhdy0U+xamMf5d0nqThiFjb8yamYftyTW7tpclfPP6ozt5sPyvpVk3+6uuwpDWStkr6iaQvSTog6RsR0fMv3hr0dqvOcuTmLvXWaGTpXarxs6tyxOtK+uEKPyAnrvADkiL8QFKEH0iK8ANJEX4gKcIPJEX4gaQIP5DU/wG6SwYLYCwMKQAAAABJRU5ErkJggg==\n", + "image/png": "iVBORw0KGgoAAAANSUhEUgAAAPsAAAD4CAYAAAAq5pAIAAAAOXRFWHRTb2Z0d2FyZQBNYXRwbG90bGliIHZlcnNpb24zLjMuMywgaHR0cHM6Ly9tYXRwbG90bGliLm9yZy/Il7ecAAAACXBIWXMAAAsTAAALEwEAmpwYAAANYElEQVR4nO3df6hc9ZnH8c9n3QTEFk0ie7kYWWvUP+KiVq6yuLK41EZXNDEgNUEWS4X0jwoV44+QFSIsouxud/8MpDQ0atemITGNddnUDfXHggleJcZE02oksQk3CdmATRCpSZ79454st3rnzM05Z+ZM8rxfcJmZ88yc8zD6yfk153wdEQJw7vuzthsA0B+EHUiCsANJEHYgCcIOJPHn/VyYbQ79Az0WEZ5seq01u+3bbf/W9ke2l9WZF4DectXz7LbPk/Q7Sd+WtF/SW5IWR8T7JZ9hzQ70WC/W7DdK+igiPo6IP0r6uaQFNeYHoIfqhP0SSb+f8Hp/Me1P2F5ie9T2aI1lAaip5wfoImKVpFUSm/FAm+qs2Q9IunTC69nFNAADqE7Y35J0pe1v2J4uaZGkTc20BaBplTfjI+KE7QclbZZ0nqTVEbGrsc4ANKryqbdKC2OfHei5nvyoBsDZg7ADSRB2IAnCDiRB2IEkCDuQBGEHkiDsQBKEHUiCsANJEHYgCcIOJEHYgST6eitpVPPII4+U1s8///yOtWuuuab0s/fcc0+lnk5buXJlaf3NN9/sWHvuuedqLRtnhjU7kARhB5Ig7EAShB1IgrADSRB2IAnCDiTB3WUHwNq1a0vrdc+Ft2nPnj0da7feemvpZz/55JOm20mBu8sCyRF2IAnCDiRB2IEkCDuQBGEHkiDsQBJcz94HbZ5H3717d2l98+bNpfXLL7+8tH7XXXeV1ufMmdOxdt9995V+9umnny6t48zUCrvtvZKOSTop6UREjDTRFIDmNbFm/7uIONLAfAD0EPvsQBJ1wx6Sfm37bdtLJnuD7SW2R22P1lwWgBrqbsbfHBEHbP+FpFds746I1ye+ISJWSVolcSEM0KZaa/aIOFA8Hpb0oqQbm2gKQPMqh932Bba/fvq5pHmSdjbVGIBm1dmMH5L0ou3T8/mPiPivRro6y4yMlJ9xXLhwYa3579q1q7Q+f/78jrUjR8pPlBw/fry0Pn369NL61q1bS+vXXnttx9qsWbNKP4tmVQ57RHwsqfN/SQADhVNvQBKEHUiCsANJEHYgCcIOJMElrg0YHh4urRenJzvqdmrttttuK62PjY2V1utYunRpaX3u3LmV5/3yyy9X/izOHGt2IAnCDiRB2IEkCDuQBGEHkiDsQBKEHUiC8+wNeOmll0rrV1xxRWn92LFjpfWjR4+ecU9NWbRoUWl92rRpfeoEdbFmB5Ig7EAShB1IgrADSRB2IAnCDiRB2IEkOM/eB/v27Wu7hY4effTR0vpVV11Va/7btm2rVEPzWLMDSRB2IAnCDiRB2IEkCDuQBGEHkiDsQBKOiP4tzO7fwiBJuvPOO0vr69atK613G7L58OHDpfWy6+Ffe+210s+imoiYdKCCrmt226ttH7a9c8K0mbZfsf1h8TijyWYBNG8qm/E/lXT7l6Ytk7QlIq6UtKV4DWCAdQ17RLwu6cv3RVogaU3xfI2ku5ttC0DTqv42figiTg8wdlDSUKc32l4iaUnF5QBoSO0LYSIiyg68RcQqSaskDtABbap66u2Q7WFJKh7LD8kCaF3VsG+SdH/x/H5Jv2ymHQC90nUz3vYLkm6RdLHt/ZJWSHpG0i9sPyBpn6Tv9LJJVDcyMlJa73YevZu1a9eW1jmXPji6hj0iFncofavhXgD0ED+XBZIg7EAShB1IgrADSRB2IAluJX0O2LhxY8favHnzas372WefLa0/8cQTteaP/mHNDiRB2IEkCDuQBGEHkiDsQBKEHUiCsANJcCvps8Dw8HBp/d133+1YmzVrVulnjxw5Ulq/6aabSut79uwpraP/Kt9KGsC5gbADSRB2IAnCDiRB2IEkCDuQBGEHkuB69rPA+vXrS+vdzqWXef7550vrnEc/d7BmB5Ig7EAShB1IgrADS
RB2IAnCDiRB2IEkOM8+AObPn19av/766yvP+9VXXy2tr1ixovK8cXbpuma3vdr2Yds7J0x70vYB29uLvzt62yaAuqayGf9TSbdPMv3fI+K64u8/m20LQNO6hj0iXpd0tA+9AOihOgfoHrS9o9jMn9HpTbaX2B61PVpjWQBqqhr2lZLmSLpO0pikH3V6Y0SsioiRiBipuCwADagU9og4FBEnI+KUpB9LurHZtgA0rVLYbU+8t/FCSTs7vRfAYOh6nt32C5JukXSx7f2SVki6xfZ1kkLSXknf712LZ79u15svX768tD5t2rTKy96+fXtp/fjx45XnjbNL17BHxOJJJv+kB70A6CF+LgskQdiBJAg7kARhB5Ig7EASXOLaB0uXLi2t33DDDbXmv3Hjxo41LmHFaazZgSQIO5AEYQeSIOxAEoQdSIKwA0kQdiAJR0T/Fmb3b2ED5PPPPy+t17mEVZJmz57dsTY2NlZr3jj7RIQnm86aHUiCsANJEHYgCcIOJEHYgSQIO5AEYQeS4Hr2c8DMmTM71r744os+dvJVn376acdat966/f7gwgsvrNSTJF100UWl9YcffrjyvKfi5MmTHWuPP/546Wc/++yzSstkzQ4kQdiBJAg7kARhB5Ig7EAShB1IgrADSXCe/RywY8eOtlvoaN26dR1r3a61HxoaKq3fe++9lXoadAcPHiytP/XUU5Xm23XNbvtS27+x/b7tXbZ/WEyfafsV2x8WjzMqdQCgL6ayGX9C0tKImCvpryX9wPZcScskbYmIKyVtKV4DGFBdwx4RYxHxTvH8mKQPJF0iaYGkNcXb1ki6u0c9AmjAGe2z275M0jclbZM0FBGnd7oOSpp0B8v2EklLavQIoAFTPhpv+2uS1kt6KCL+MLEW43etnPRmkhGxKiJGImKkVqcAaplS2G1P03jQfxYRG4rJh2wPF/VhSYd70yKAJnS9lbRta3yf/GhEPDRh+r9I+t+IeMb2MkkzI+KxLvNKeSvpDRs2lNYXLFjQp05yOXHiRMfaqVOnas1706ZNpfXR0dHK837jjTdK61u3bi2td7qV9FT22f9G0j9Ies/29mLacknPSPqF7Qck7ZP0nSnMC0BLuoY9Iv5H0qT/Ukj6VrPtAOgVfi4LJEHYgSQIO5AEYQeSIOxAEgzZPAAee6z05wm1h3Quc/XVV5fWe3kZ6erVq0vre/furTX/9evXd6zt3r271rwHGUM2A8kRdiAJwg4kQdiBJAg7kARhB5Ig7EASnGcHzjGcZweSI+xAEoQdSIKwA0kQdiAJwg4kQdiBJAg7kARhB5Ig7EAShB1IgrADSRB2IAnCDiRB2IEkuobd9qW2f2P7fdu7bP+wmP6k7QO2txd/d/S+XQBVdb15he1hScMR8Y7tr0t6W9LdGh+P/XhE/OuUF8bNK4Ce63TziqmMzz4maax4fsz2B5IuabY9AL12Rvvsti+T9E1J24pJD9reYXu17RkdPrPE9qjt0XqtAqhjyvegs/01Sa9JeioiNtgeknREUkj6J41v6n+vyzzYjAd6rNNm/JTCbnuapF9J2hwR/zZJ/TJJv4qIv+oyH8IO9FjlG07atqSfSPpgYtCLA3enLZS0s26TAHpnKkfjb5b0hqT3JJ0qJi+XtFjSdRrfjN8r6fvFwbyyebFmB3qs1mZ8Uwg70HvcNx5IjrADSRB2IAnCDiRB2IEkCDuQBGEHkiDsQBKEHUiCsANJEHYgCcIOJEHYgSQIO5BE1xtONuyIpH0TXl9cTBtEg9rboPYl0VtVTfb2l50Kfb2e/SsLt0cjYqS1BkoMam+D2pdEb1X1qzc244EkCDuQRNthX9Xy8ssMam+D2pdEb1X1pbdW99kB9E/ba3YAfULYgSRaCbvt223/1vZHtpe10UMntvfafq8YhrrV8emKMfQO2945YdpM26/Y/rB4nHSMvZZ6G4hhvEuGGW/1u2t7+PO+77PbPk/S7yR9W9J+SW9JWhwR7/e1kQ5s75U0EhGt/wDD9t9KOi7p2dNDa9n+Z0lHI+KZ4h/KGRHx+ID09qTOcBjvHvXWaZjx76rF767J4c+raGPNfqOkjyLi44j4o6SfS1rQQh8DLyJel3T0S5MXSFpTPF+j8f9Z+q5DbwMhIsYi4p3i+TFJp4cZb/W7K+mrL9oI+yWSfj/h9X4N1njvIenXtt+2vaTtZiYxNGGYrYOShtpsZhJdh/Hupy8NMz4w312V4c/r4gDdV90cEddL+ntJPyg2VwdSjO+DDdK505WS5mh8DMAxST9qs5limPH1kh6KiD9MrLX53U3SV1++tzbCfkDSpRNezy6mDYSIOFA8Hpb0osZ3OwbJodMj6BaPh1vu5/9FxKGIOBkRpyT9WC1+d8Uw4+sl/SwiNhSTW//uJuurX99bG2F/S9KVtr9he7qkRZI2tdDHV9i+oDhwItsXSJqnwRuKepOk+4vn90v6ZYu9/IlBGca70zDjavm7a33484jo+5+kOzR+RH6PpH9so4cOfV0u6d3ib1fbvUl6QeObdV9o/NjGA5JmSdoi6UNJ/y1p5gD19pzGh/beofFgDbfU280a30TfIWl78XdH299dSV99+d74uSyQBAfogCQIO5AEYQeSIOxAEoQdSIKwA0kQdiCJ/wN8jzcem5JvKwAAAABJRU5ErkJggg==\n", "text/plain": [ "
" ] }, - "metadata": {}, + "metadata": { + "needs_background": "light" + }, "output_type": "display_data" }, { @@ -314,12 +318,14 @@ }, { "data": { - "image/png": "iVBORw0KGgoAAAANSUhEUgAAAP8AAAD8CAYAAAC4nHJkAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADl0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uIDIuMi4yLCBodHRwOi8vbWF0cGxvdGxpYi5vcmcvhp/UCwAADCFJREFUeJzt3WGoXPWZx/Hvs1n7wrQvDDUarGu6RVdLxGS5iBBZXarFFSHmRaUKS2RL0xcNWNgXK76psBREtt1dfFFIaWgqrbVEs2pdbYsspguLGjVU21grcre9a8hVFGoVKSbPvrgn5VbvnLmZOTNnkuf7gTAz55kz52HI7/7PzDlz/pGZSKrnz/puQFI/DL9UlOGXijL8UlGGXyrK8EtFGX6pKMMvFWX4paL+fJobiwhPJ5QmLDNjNc8ba+SPiOsi4lcR8UpE3D7Oa0marhj13P6IWAO8DFwLLADPADdn5i9b1nHklyZsGiP/5cArmflqZv4B+AGwbYzXkzRF44T/POC3yx4vNMv+RETsjIiDEXFwjG1J6tg4X/ittGvxod36zNwN7AZ3+6VZMs7IvwCcv+zxJ4DXxmtH0rSME/5ngAsj4pMR8RHg88DD3bQladJG3u3PzPcjYhfwY2ANsCczf9FZZ5ImauRDfSNtzM/80sRN5SQfSacuwy8VZfilogy/VJThl4oy/FJRhl8qyvBLRRl+qSjDLxVl+KWiDL9UlOGXijL8UlGGXyrK8EtFGX6pKMMvFWX4paIMv1SU4ZeKmuoU3arnoosuGlh76aWXWte97bbbWuv33HPPSD1piSO/VJThl4oy/FJRhl8qyvBLRRl+qSjDLxU11nH+iJgH3gaOAe9n5lwXTen0sWXLloG148ePt667sLDQdTtapouTfP42M9/o4HUkTZG7/VJR44Y/gZ9ExLMRsbOLhiRNx7i7/Vsz87WIWA/8NCJeyswDy5/Q/FHwD4M0Y8Ya+TPzteZ2EdgPXL7Cc3Zn5pxfBkqzZeTwR8TaiPjYifvAZ4EXu2pM0mSNs9t/DrA/Ik68zvcz8/FOupI0cSOHPzNfBS7rsBedhjZv3jyw9s4777Suu3///q7b0TIe6pOKMvxSUYZfKsrwS0UZfqkowy8V5aW7NZZNmza11nft2jWwdu+993bdjk6CI79UlOGXijL8UlGGXyrK8EtFGX6pKMMvFeVxfo3l4osvbq2vXbt2YO3+++/vuh2dBEd+qSjDLxVl+KWiDL9UlOGXijL8UlGGXyoqMnN6G4uY3sY0FU8//XRr/eyzzx5YG3YtgGGX9tbKMjNW8zxHfqkowy8VZfilogy/VJThl4oy/FJRhl8qaujv+SNiD3ADsJiZm5pl64D7gY3APHBTZr41uTbVl40bN7bW5+bmWusvv/zywJrH8fu1mpH/O8B1H1h2O/BEZl4IPNE8lnQKGRr+zDwAvPmBxduAvc39vcCNHfclacJG/cx/TmYeAWhu13fXkqRpmPg1/CJiJ7Bz0tuRdHJGHfmPRsQGgOZ2cdATM3N3Zs5lZvs3Q5KmatTwPwzsaO7vAB7qph1J0zI0/BFxH/A/wF9FxEJEfAG4C7g2In4NXNs8lnQKGfqZPzNvHlD6TMe9aAZdddVVY63/+uuvd9SJuuYZflJRhl8qyvBLRRl+qSjDLxVl+KWinKJbrS699NKx1r/77rs76kRdc+SXijL8UlGGXyrK8EtFGX6pKMMvFWX4paKcoru4K664orX+6KOPttbn5+db61u3bh1Ye++991rX1WicoltSK8MvFWX4paIMv1SU4ZeKMvxSUYZfKsrf8xd3zTXXtNbXrVvXWn/88cdb6x7Ln12O/FJRhl8qyvBLRRl+qSjDLxVl+KWiDL9U1NDj/BGxB7gBWMzMTc2yO4EvAifmX74jM/9zUk1qci677LLW+rDrPezbt6/LdjRFqxn5vwNct8Lyf83Mzc0/gy+dYoaGPzMPAG9OoRdJUzTOZ/5dEfHziNgTEWd11pGkqRg1/N8EPgVsBo4AXx/0xIjYGREHI+LgiNuSNAEjhT8zj2bmscw8DnwLuLzlubszcy4z50ZtUlL3Rgp/RGxY9nA78GI37UialtUc6rsPuBr4eEQsAF8Fro6IzUAC88CXJtijpAnwuv2nuXPPPbe1fujQodb6W2+91Vq/5JJLTronTZbX7ZfUyvBLRRl+qSjDLxVl+KWiDL9UlJfuPs3deuutrfX169e31h977LEOu9EsceSXijL8UlGGXyrK8EtFGX6pKMMvFWX4paI8zn+au+CCC8Zaf9hPenXqcuSXijL8UlGGXyrK8EtFGX6pKMMvFWX4paI8zn+au+GGG8Za/5FHHumoE80aR36pKMMvFWX4paIMv1SU4ZeKMvxSUYZfKmrocf6IOB/4LnAucBzYnZn/HhHrgPuBjcA8cFNm+uPvHlx55ZUDa8Om6FZdqxn53wf+MTMvAa4AvhwRnwZuB57IzAuBJ5rHkk4RQ8OfmUcy87nm/tvAYeA8YBuwt3naXuDGSTUpqXsn9Zk/IjYCW4CngHMy8wgs/YEA2ud9kjRTVn1uf0R8FHgA+Epm/i4iVrveTmDnaO1JmpRVjfwRcQZLwf9eZj7YLD4aERua+gZgcaV1M3N3Zs5l5lwXDUvqxtDwx9IQ/23gcGZ+Y1npYWBHc38H8FD37UmalNXs9m8F/h54ISIONcvuAO4CfhgRXwB+A3xuMi1qmO3btw+srVmzpnXd559/vrV+4MCBkXrS7Bsa/sz8b2DQB/zPdNuOpGnxDD+pKMMvFWX4paIMv1SU4ZeKMvxSUV66+xRw5plnttavv/76kV973759rfVjx46N/NqabY78UlGGXyrK8EtFGX6pKMMvFWX4paIMv1RUZOb0NhYxvY2dRs4444zW+pNPPjmwtri44gWW/uiWW25prb/77rutdc2ezFzVNfYc+aWiDL9UlOGXijL8UlGGXyrK8EtFGX6pKI/zS6cZj/NLamX4paIMv1SU4ZeKMvxSUYZfKsrwS0UNDX9EnB8R/xURhyPiFxFxW7P8zoj4v4g41Pwb/eLxkqZu6Ek+EbEB2JCZz0XEx4BngRuBm4DfZ+a/rHpjnuQjTdxqT/IZOmNPZh4BjjT3346Iw8B547UnqW8n9Zk/IjYCW4CnmkW7IuLnEbEnIs4asM7OiDgYEQfH6lRSp1Z9bn9EfBR4EvhaZj4YEecAbwAJ/DNLHw3+YchruNsvTdhqd/tXFf6IOAP4EfDjzPzGCvWNwI8yc9OQ1zH80oR19sOeiAjg28Dh5cFvvgg8YTvw4sk2Kak/q/m2/0rgZ8ALwPFm8R3AzcBmlnb754EvNV8Otr2WI780YZ3u9nfF8EuT5+/5JbUy/FJRhl8
qyvBLRRl+qSjDLxVl+KWiDL9UlOGXijL8UlGGXyrK8EtFGX6pKMMvFTX0Ap4dewP432WPP94sm0Wz2tus9gX2Nqoue7tgtU+c6u/5P7TxiIOZOddbAy1mtbdZ7QvsbVR99eZuv1SU4ZeK6jv8u3vefptZ7W1W+wJ7G1UvvfX6mV9Sf/oe+SX1pJfwR8R1EfGriHglIm7vo4dBImI+Il5oZh7udYqxZhq0xYh4cdmydRHx04j4dXO74jRpPfU2EzM3t8ws3et7N2szXk99tz8i1gAvA9cCC8AzwM2Z+cupNjJARMwDc5nZ+zHhiPgb4PfAd0/MhhQRdwNvZuZdzR/OszLzn2aktzs5yZmbJ9TboJmlb6XH967LGa+70MfIfznwSma+mpl/AH4AbOuhj5mXmQeANz+weBuwt7m/l6X/PFM3oLeZkJlHMvO55v7bwImZpXt971r66kUf4T8P+O2yxwvM1pTfCfwkIp6NiJ19N7OCc07MjNTcru+5nw8aOnPzNH1gZumZee9GmfG6a32Ef6XZRGbpkMPWzPxr4O+ALze7t1qdbwKfYmkatyPA1/tspplZ+gHgK5n5uz57WW6Fvnp53/oI/wJw/rLHnwBe66GPFWXma83tIrCfpY8ps+ToiUlSm9vFnvv5o8w8mpnHMvM48C16fO+amaUfAL6XmQ82i3t/71bqq6/3rY/wPwNcGBGfjIiPAJ8HHu6hjw+JiLXNFzFExFrgs8ze7MMPAzua+zuAh3rs5U/MyszNg2aWpuf3btZmvO7lJJ/mUMa/AWuAPZn5tak3sYKI+EuWRntY+sXj9/vsLSLuA65m6VdfR4GvAv8B/BD4C+A3wOcyc+pfvA3o7WpOcubmCfU2aGbpp+jxvetyxutO+vEMP6kmz/CTijL8UlGGXyrK8EtFGX6pKMMvFWX4paIMv1TU/wNPnZK3k8+kHgAAAABJRU5ErkJggg==\n", + "image/png": "iVBORw0KGgoAAAANSUhEUgAAAPsAAAD4CAYAAAAq5pAIAAAAOXRFWHRTb2Z0d2FyZQBNYXRwbG90bGliIHZlcnNpb24zLjMuMywgaHR0cHM6Ly9tYXRwbG90bGliLm9yZy/Il7ecAAAACXBIWXMAAAsTAAALEwEAmpwYAAAMEElEQVR4nO3dXYhc5R3H8d+vabwwepFUE4OKsRJRUUzKIoKhWnzBBiHmRoxQEiqsFwYi9KJiLxRKQaTaCy+EFcU0WF+IBqPWaBrEtDeaVVNNfIlWIiasWSWCb4g1+fdiT8oad85s5pwzZ9z/9wPLzDzPnDl/DvnlOXNe5nFECMDM95O2CwDQH4QdSIKwA0kQdiAJwg4k8dN+rsw2h/6BhkWEp2qvNLLbvtr2u7bft31rlc8C0Cz3ep7d9ixJeyRdKWmfpB2SVkXEWyXLMLIDDWtiZL9I0vsR8UFEfCvpUUkrKnwegAZVCfupkj6a9Hpf0fY9todtj9oerbAuABU1foAuIkYkjUjsxgNtqjKy75d0+qTXpxVtAAZQlbDvkLTY9pm2j5N0vaTN9ZQFoG4978ZHxHe210p6XtIsSQ9GxO7aKgNQq55PvfW0Mr6zA41r5KIaAD8ehB1IgrADSRB2IAnCDiRB2IEkCDuQBGEHkiDsQBKEHUiCsANJEHYgCcIOJEHYgSQIO5AEYQeSIOxAEoQdSIKwA0kQdiAJwg4k0dcpm5HP2Wef3bHvnXfeKV123bp1pf333ntvTzVlxcgOJEHYgSQIO5AEYQeSIOxAEoQdSIKwA0lwnh2NWrp0ace+w4cPly67b9++ustJrVLYbe+V9IWkQ5K+i4ihOooCUL86RvZfRcSnNXwOgAbxnR1IomrYQ9ILtl+1PTzVG2wP2x61PVpxXQAqqLobvywi9tueL2mr7XciYvvkN0TEiKQRSbIdFdcHoEeVRvaI2F88jkvaJOmiOooCUL+ew257ju0TjzyXdJWkXXUVBqBeVXbjF0jaZPvI5/wtIrbUUhVmjCVLlnTs++qrr0qX3bRpU83V5NZz2CPiA0kX1lgLgAZx6g1IgrADSRB2IAnCDiRB2IEkuMUVlZx//vml/WvXru3Yt2HDhrrLQQlGdiAJwg4kQdiBJAg7kARhB5Ig7EAShB1IgvPsqOScc84p7Z8zZ07Hvscee6zuclCCkR1IgrADSRB2IAnCDiRB2IEkCDuQBGEHknBE/yZpYUaYmeeVV14p7T/55JM79nW7F77bT01jahHhqdoZ2YEkCDuQBGEHkiDsQBKEHUiCsANJEHYgCe5nR6lFixaV9g8NDZX279mzp2Mf59H7q+vIbvtB2+O2d01qm2d7q+33ise5zZYJoKrp7MY/JOnqo9pulbQtIhZL2la8BjDAuoY9IrZLOnhU8wpJ64vn6yVdW29ZAOrW63f2BRExVjz/WNKCTm+0PSxpuMf1AKhJ5QN0ERFlN7hExIikEYkbYYA29Xrq7YDthZJUPI7XVxKAJvQa9s2SVhfPV0t6qp5yADSl62687UckXSbpJNv7JN0u6U5Jj9u+UdKHkq5rski059JLL620/CeffFJTJaiqa9gjYlWHrstrrgVAg7hcFkiCsANJEHYgCcIOJEHYgSS4xRWlLrjggkrL33XXXTVVgqoY2YEkCDuQBGEHkiDsQBKEHUiCsANJEHYgCaZsTu7iiy8u7X/22WdL+/fu3Vvaf8kll3Ts++abb0qXRW+YshlIjrADSRB2IAnCDiRB2IEkCDuQBGEHkuB+9uSuuOKK0v558+aV9m/ZsqW0n3Ppg4ORHUiCsANJEHYgCcIOJEHYgSQIO5AEYQeS4Dx7chdeeGFpf7ffO9i4cWOd5aBBXUd22w/aHre9a1LbHbb3295Z/C1vtkwAVU1nN/4hSVdP0f6XiFhS/P293rIA1K1r2CNiu6SDfagFQIOqHKBba/uNYjd/bqc32R62PWp7tMK6AFTUa9jvk3SWpCWSxiTd3emNETESEUMRMdTjugDUoKewR8SBiDgUEYcl3S/ponrLAlC3nsJue+Gklysl7er0XgCDoevvxtt+RNJlkk6SdEDS7cXrJZJC0l5JN0XEWNeV8bvxfXfKKaeU9u/cubO0/7PPPivtP/fcc4+1JDSs0+/Gd72oJiJWTdH8QOWKAPQVl8sCSRB2IAnCDiRB2IEkCDuQBLe4znBr1qwp7Z8/f35p/3PPPVdjNWgTIzuQBGEHkiDsQBKEHUiCsANJEHYgCcIOJMF59hnujDPOqLR8t1tc8ePByA4kQdiBJAg7kARhB5Ig7EAShB1IgrADSXCefYa75pprKi3/9NNP11QJ2sbIDiRB2IEkCDuQBGEHkiDsQBKEHUiCsANJcJ59Bli2bFnHvm5TNiOPriO77dNtv2j7Ldu7ba8r2ufZ3mr7veJxbvPlAujVdHbjv5P0u4g4T9LFkm62fZ6kWyVti4jFkrYVrwEMqK5hj4ixiHiteP6FpLclnSpphaT1xdvWS7q2oRoB1OCYvrPbXiRpqaSXJS2IiLGi62NJCzosMyxpuEKNAGow7aPxtk+Q9ISkWyLi8
8l9ERGSYqrlImIkIoYiYqhSpQAqmVbYbc/WRNAfjogni+YDthcW/QsljTdTIoA6dN2Nt21JD0h6OyLumdS1WdJqSXcWj081UiG6WrlyZce+WbNmlS77+uuvl/Zv3769p5oweKbznf0SSb+R9KbtnUXbbZoI+eO2b5T0oaTrGqkQQC26hj0i/iXJHbovr7ccAE3hclkgCcIOJEHYgSQIO5AEYQeS4BbXH4Hjjz++tH/58uU9f/bGjRtL+w8dOtTzZ2OwMLIDSRB2IAnCDiRB2IEkCDuQBGEHkiDsQBKe+JGZPq3M7t/KZpDZs2eX9r/00ksd+8bHy39T5IYbbijt//rrr0v7MXgiYsq7VBnZgSQIO5AEYQeSIOxAEoQdSIKwA0kQdiAJzrMDMwzn2YHkCDuQBGEHkiDsQBKEHUiCsANJEHYgia5ht3267Rdtv2V7t+11Rfsdtvfb3ln89f7j5QAa1/WiGtsLJS2MiNdsnyjpVUnXamI+9i8j4s/TXhkX1QCN63RRzXTmZx+TNFY8/8L225JOrbc8AE07pu/sthdJWirp5aJpre03bD9oe26HZYZtj9oerVYqgCqmfW287RMkvSTpTxHxpO0Fkj6VFJL+qIld/d92+Qx244GGddqNn1bYbc+W9Iyk5yPinin6F0l6JiLO7/I5hB1oWM83wti2pAckvT056MWBuyNWStpVtUgAzZnO0fhlkv4p6U1Jh4vm2yStkrREE7vxeyXdVBzMK/ssRnagYZV24+tC2IHmcT87kBxhB5Ig7EAShB1IgrADSRB2IAnCDiRB2IEkCDuQBGEHkiDsQBKEHUiCsANJEHYgia4/OFmzTyV9OOn1SUXbIBrU2ga1LonaelVnbWd06ujr/ew/WLk9GhFDrRVQYlBrG9S6JGrrVb9qYzceSIKwA0m0HfaRltdfZlBrG9S6JGrrVV9qa/U7O4D+aXtkB9AnhB1IopWw277a9ru237d9axs1dGJ7r+03i2moW52frphDb9z2rklt82xvtf1e8TjlHHst1TYQ03iXTDPe6rZre/rzvn9ntz1L0h5JV0raJ2mHpFUR8VZfC+nA9l5JQxHR+gUYtn8p6UtJfz0ytZbtuyQdjIg7i/8o50bE7wektjt0jNN4N1Rbp2nG16jFbVfn9Oe9aGNkv0jS+xHxQUR8K+lRSStaqGPgRcR2SQePal4haX3xfL0m/rH0XYfaBkJEjEXEa8XzLyQdmWa81W1XUldftBH2UyV9NOn1Pg3WfO8h6QXbr9oebruYKSyYNM3Wx5IWtFnMFLpO491PR00zPjDbrpfpz6viAN0PLYuIX0j6taSbi93VgRQT38EG6dzpfZLO0sQcgGOS7m6zmGKa8Sck3RIRn0/ua3PbTVFXX7ZbG2HfL+n0Sa9PK9oGQkTsLx7HJW3SxNeOQXLgyAy6xeN4y/X8X0QciIhDEXFY0v1qcdsV04w/IenhiHiyaG59201VV7+2Wxth3yFpse0zbR8n6XpJm1uo4wdszykOnMj2HElXafCmot4saXXxfLWkp1qs5XsGZRrvTtOMq+Vt1/r05xHR9z9JyzVxRP4/kv7QRg0d6vq5pH8Xf7vbrk3SI5rYrfuvJo5t3CjpZ5K2SXpP0j8kzRug2jZoYmrvNzQRrIUt1bZME7vob0jaWfwtb3vbldTVl+3G5bJAEhygA5Ig7EAShB1IgrADSRB2IAnCDiRB2IEk/gciQMnFdlEPHAAAAABJRU5ErkJggg==\n", "text/plain": [ "
" ] }, - "metadata": {}, + "metadata": { + "needs_background": "light" + }, "output_type": "display_data" }, { @@ -331,12 +337,14 @@ }, { "data": { - "image/png": "iVBORw0KGgoAAAANSUhEUgAAAP8AAAD8CAYAAAC4nHJkAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADl0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uIDIuMi4yLCBodHRwOi8vbWF0cGxvdGxpYi5vcmcvhp/UCwAADbVJREFUeJzt3W2IXPUVx/HfSWzfpH2hZE3jU9I2EitCTVljoRKtxZKUStIX0YhIiqUbJRoLfVFJwEaKINqmLRgSthi6BbUK0bqE0KaINBWCuJFaNVtblTVNs2yMEWsI0picvti7siY7/zuZuU+b8/2AzMOZuXO8+tt7Z/733r+5uwDEM6PuBgDUg/ADQRF+ICjCDwRF+IGgCD8QFOEHgiL8QFCEHwjqnCo/zMw4nBAombtbO6/rastvZkvN7A0ze9PM7u1mWQCqZZ0e229mMyX9U9INkg5IeknSLe6+L/EetvxAyarY8i+W9Ka7v+3u/5P0e0nLu1gegAp1E/4LJf170uMD2XOfYmZ9ZjZkZkNdfBaAgnXzg99Uuxan7da7e7+kfondfqBJutnyH5B08aTHF0k62F07AKrSTfhfknSpmX3RzD4raZWkwWLaAlC2jnf73f1jM7tL0p8kzZS0zd1fL6wzAKXqeKivow/jOz9QukoO8gEwfRF+ICjCDwRF+IGgCD8QFOEHgiL8QFCEHwiK8ANBEX4gKMIPBEX4gaAIPxAU4QeCIvxAUIQfCIrwA0ERfiAowg8ERfiBoAg/EFSlU3SjerNmzUrWH3744WR9zZo1yfrevXuT9ZUrV7asvfPOO8n3olxs+YGgCD8QFOEHgiL8QFCEHwiK8ANBEX4gqK5m6TWzEUkfSjoh6WN37815PbP0VmzBggXJ+vDwcFfLnzEjvf1Yt25dy9rmzZu7+mxMrd1Zeos4yOeb7n64gOUAqBC7/UBQ3YbfJe0ys71m1ldEQwCq0e1u/zfc/aCZnS/pz2b2D3ffPfkF2R8F/jAADdPVlt/dD2a3hyQ9I2nxFK/pd/fevB8DAVSr4/Cb2Swz+/zEfUnflvRaUY0BKFc3u/1zJD1jZhPLedzd/1hIVwBK13H43f1tSV8tsBd0qKenp2VtYGCgwk4wnTDUBwRF+IGgCD8QFOEHgiL8QFCEHwiKS3dPA6nTYiVpxYoVLWuLF5920GWllixZ0rKWdzrwK6+8kqzv3r07WUcaW34gKMIPBEX4gaAIPxAU4QeCIvxAUIQfCKqrS3ef8Ydx6e6OnDhxIlk/efJkRZ2cLm+svpve8qbwvvnmm5P1vOnDz1btXrqbLT8QFOEHgiL8QFCEHwiK8ANBEX4gKMIPBMU4fwPs3LkzWV+2bFmyXuc4/3vvvZesHz16tGVt3rx5RbfzKTNnzix1+U3FOD+AJMIPBEX4gaAIPxAU4QeCIvxAUIQfCCr3uv1mtk3SdyUdcvcrsufOk/SkpPmSRiTd5O7vl9fm9Hbttdcm6wsXLkzW88bxyxzn37p1a7K+a9euZP2DDz5oWbv++uuT792wYUOynufOO+9sWduyZUtXyz4btLPl/62kpac8d6+k59z9UknPZY8BTCO54Xf33ZKOnPL0ckkD2f0BSa2njAHQSJ1+55/j7qOSlN2eX1xLAKpQ+lx9ZtYnqa/szwFwZjrd8o+Z2VxJym4PtXqhu/e7e6+793b4WQBK0Gn4ByWtzu6vlvRsMe0AqEpu+M3sCUl7JC00swNm9gNJD0q6wcz+JemG7DGAaYTz+Qswf/78ZH3Pnj3J+uzZs5P1bq6Nn3ft++3btyfr999/f7J+7NixZD0l73z+vPXW09OTrH/00Ucta/fdd1/yvY888kiyfvz48WS9TpzPDyCJ8ANBEX4gKMIPBEX4gaAIPxAUQ30FWLBgQbI+PDzc1fLzhvqef/75lrVVq1Yl33v48OGOeqrC3Xffnaxv2rQpWU+tt7zToC+77LJk/a233krW68RQH4Akwg8ERfiBoAg/EBThB4Ii/EBQhB8IqvTLeKF7Q0NDyfrtt9/estbkcfw8g4ODyfqtt96arF911VVFtnPWYcsPBEX4gaAIPxAU4QeCIvxAUIQfCIrwA0Exzl+BvPPx81x99dUFdTK9mKVPS89br92s940bNybrt912W8fLbgq2/EBQhB8IivADQRF+ICjCDwRF+IGgCD8QVO44v5ltk/RdSYfc/YrsuY2Sfijp3exl6919Z1lNNt0dd9yRrOddIx5Tu/HGG5P1RYsWJeup9Z733yRvnP9s0M6W/7eSlk7x/C/d/crsn7DBB6ar3PC7+25JRyroBUCFuvnOf5eZ/d3MtpnZuYV1BKASnYZ/i6QvS7pS0qikX7R6oZn1mdmQmaUvRAegUh2F393H3P2Eu5+U9BtJixOv7Xf3Xnfv7bRJAMXrKPxmNnfSw+9Jeq2YdgBUpZ2hvickXSdptpkdkPRTSdeZ2ZWSXNKIpDUl9gigBLnhd/dbpnj60RJ6mbbyxqMj6+npaVm7/PLLk+9dv3590e184t13303Wjx8/XtpnNwVH+AFBEX4gKMIPBEX4gaAIPxAU4QeC4tLdKNWGDRta1tauXVvqZ4+MjLSsrV69Ovne/fv3F9xN87DlB4Ii/EBQhB8IivADQRF+ICjCDwRF+IGgGOdHV3buTF+4eeHChRV1crp9+/a1rL3wwgsVdtJMbPmBoAg/EBThB4Ii/EBQhB8IivADQRF+ICjG+QtgZsn6jBnd/Y1dtmxZx+/t7+9P1i+44IKOly3l/7vVOT05l1RPY8sPBEX4gaAIPxAU4QeCIvxAUIQfCIrwA0HljvOb2cWSfifpC5JOSup391+b2XmSnpQ0X9KIpJvc/f3yWm2uLVu2JOsPPfRQV8vfsWNHst7NWHrZ4/BlLn/r1q2lLTuCdrb8H0v6sbt/RdLXJa01s8sl3SvpOXe/VNJz2WMA00Ru+N191N1fzu5/KGlY0oWSlksayF42IGlFWU0CKN4Zfec3s/mSFkl6UdIcdx+Vxv9ASDq/6OYAlKftY/vN7HOStkv6kbv/N+949knv65PU11l7AMrS1pbfzD6j8eA/5u5PZ0+PmdncrD5X0qGp3uvu/e7e6+69RTQMoBi54bfxTfyjkobdfdOk0qCkialOV0t6tvj2AJTF3D39ArNrJP1V0qsaH+qTpPUa/97/lKRLJO2XtNLdj+QsK/1h09S8efOS9T179iTrPT09yXqTT5vN621sbKxlbXh4OPnevr70t8XR0dFk/dixY8n62crd2/pOnvud391fkNRqYd86k6YANAdH+AFBEX4gKMIPBEX4gaAIPxAU4QeCyh3nL/TDztJx/jxLlixJ1lesSJ8Tdc899yTrTR7nX7duXcva5s2bi24
Han+cny0/EBThB4Ii/EBQhB8IivADQRF+ICjCDwTFOP80sHTp0mQ9dd573jTVg4ODyXreFN95l3Pbt29fy9r+/fuT70VnGOcHkET4gaAIPxAU4QeCIvxAUIQfCIrwA0Exzg+cZRjnB5BE+IGgCD8QFOEHgiL8QFCEHwiK8ANB5YbfzC42s+fNbNjMXjeze7LnN5rZf8zsb9k/3ym/XQBFyT3Ix8zmSprr7i+b2ecl7ZW0QtJNko66+8/b/jAO8gFK1+5BPue0saBRSaPZ/Q/NbFjShd21B6BuZ/Sd38zmS1ok6cXsqbvM7O9mts3Mzm3xnj4zGzKzoa46BVCoto/tN7PPSfqLpAfc/WkzmyPpsCSX9DONfzW4PWcZ7PYDJWt3t7+t8JvZZyTtkPQnd980RX2+pB3ufkXOcgg/ULLCTuyx8cuzPippeHLwsx8CJ3xP0mtn2iSA+rTza/81kv4q6VVJE3NBr5d0i6QrNb7bPyJpTfbjYGpZbPmBkhW6218Uwg+Uj/P5ASQRfiAowg8ERfiBoAg/EBThB4Ii/EBQhB8IivADQRF+ICjCDwRF+IGgCD8QFOEHgsq9gGfBDkt6Z9Lj2dlzTdTU3pral0RvnSqyt3ntvrDS8/lP+3CzIXfvra2BhKb21tS+JHrrVF29sdsPBEX4gaDqDn9/zZ+f0tTemtqXRG+dqqW3Wr/zA6hP3Vt+ADWpJfxmttTM3jCzN83s3jp6aMXMRszs1Wzm4VqnGMumQTtkZq9Neu48M/uzmf0ru51ymrSaemvEzM2JmaVrXXdNm/G68t1+M5sp6Z+SbpB0QNJLkm5x932VNtKCmY1I6nX32seEzWyJpKOSfjcxG5KZPSTpiLs/mP3hPNfdf9KQ3jbqDGduLqm3VjNLf181rrsiZ7wuQh1b/sWS3nT3t939f5J+L2l5DX00nrvvlnTklKeXSxrI7g9o/H+eyrXorRHcfdTdX87ufyhpYmbpWtddoq9a1BH+CyX9e9LjA2rWlN8uaZeZ7TWzvrqbmcKciZmRstvza+7nVLkzN1fplJmlG7PuOpnxumh1hH+q2USaNOTwDXf/mqRlktZmu7dozxZJX9b4NG6jkn5RZzPZzNLbJf3I3f9bZy+TTdFXLeutjvAfkHTxpMcXSTpYQx9TcveD2e0hSc9o/GtKk4xNTJKa3R6quZ9PuPuYu59w95OSfqMa1102s/R2SY+5+9PZ07Wvu6n6qmu91RH+lyRdamZfNLPPSlolabCGPk5jZrOyH2JkZrMkfVvNm314UNLq7P5qSc/W2MunNGXm5lYzS6vmdde0Ga9rOcgnG8r4laSZkra5+wOVNzEFM/uSxrf20vgZj4/X2ZuZPSHpOo2f9TUm6aeS/iDpKUmXSNovaaW7V/7DW4vertMZztxcUm+tZpZ+UTWuuyJnvC6kH47wA2LiCD8gKMIPBEX4gaAIPxAU4QeCIvxAUIQfCIrwA0H9HwAENgeMtPBpAAAAAElFTkSuQmCC\n", + "image/png": "iVBORw0KGgoAAAANSUhEUgAAAPsAAAD4CAYAAAAq5pAIAAAAOXRFWHRTb2Z0d2FyZQBNYXRwbG90bGliIHZlcnNpb24zLjMuMywgaHR0cHM6Ly9tYXRwbG90bGliLm9yZy/Il7ecAAAACXBIWXMAAAsTAAALEwEAmpwYAAANrUlEQVR4nO3df4gU9xnH8c+jbf+x/UPrVcyPaluDQQqNxZhCg0lTWjQQvP6RRgnBksKZYKKBQisKqaEUQtKm/0SUCwm9ljalYNIeIq2pSG1ASs6QH+aubX6gVrmcMUIakRCjT//YMZx6853LzszOns/7BcfuzrM7+2SST2Z2vzvzNXcXgMvftKYbANAZhB0IgrADQRB2IAjCDgTxqU6+mZnx1T9QM3e3iZaX2rOb2XIz+7eZvWFmG8usC0C9rN1xdjObLuk/kr4j6aikFyStdvfhxGvYswM1q2PPvlTSG+7+lrt/KOkPklaWWB+AGpUJ+5WS/jvu8dFs2QXMrM/MhsxsqMR7ASip9i/o3L1fUr/EYTzQpDJ79mOSrh73+KpsGYAuVCbsL0i6xsy+ZGafkbRK0mA1bQGoWtuH8e7+kZndJ+mvkqZLesrdX6usMwCVanvora034zM7ULtaflQDYOog7EAQhB0IgrADQRB2IAjCDgRB2IEgCDsQBGEHgiDsQBCEHQiCsANBEHYgCMIOBEHYgSAIOxAEYQeCIOxAEIQdCIKwA0EQdiCIjk7ZjM6bMWNGsv7oo48m62vXrk3WDxw4kKzffvvtubXDhw8nX4tqsWcHgiDsQBCEHQiCsANBEHYgCMIOBEHYgSCYxfUyt2DBgmR9ZGSk1PqnTUvvL9avX59b27p1a6n3xsTyZnEt9aMaMzsk6X1JZyV95O5LyqwPQH2q+AXdt9z9RAXrAVAjPrMDQZQNu0vabWYHzKxvoieYWZ+ZDZnZUMn3AlBC2cP4G939mJl9QdJzZvYvd983/gnu3i+pX+ILOqBJpfbs7n4suz0u6VlJS6toCkD12g67mc0ws8+dvy/pu5IOVtUYgGqVOYyfI+lZMzu/nt+7+18q6QqfSE9PT25tYGCgg52gm7Uddnd/S9LXKuwFQI0YegOCIOxAEIQdCIKwA0EQdiAILiU9BaROE5Wk3t7e3NrSpc3+zmnZsmW5taLTY19++eVkfd++fck6LsSeHQiCsANBEHYgCMIOBEHYgSAIOxAEYQeC4FLSU8DZs2eT9XPnznWok0sVjZWX6a1oSuc77rgjWS+aTvpylXcpafbsQBCEHQiCsANBEHYgCMIOBEHYgSAIOxAE4+xdYNeuXcn6ihUrkvUmx9nffffdZP3UqVO5tXnz5lXdzgWmT59e6/q7FePsQHCEHQiCsANBEHYgCMIOBEHYgSAIOxAE143vgJtuuilZX7hwYbJeNI5e5zj79u3bk/Xdu3cn6++9915u7ZZbbkm+dvPmzcl6kXvvvTe3tm3btlLrnooK9+xm9pSZHTezg+OWzTKz58zs9ex2Zr1tAihrMofxv5a0/KJlGyXtcfdrJO3JHgPoYoVhd/d9kk5etHilpIHs/oCk3mrbAlC1dj+zz3H30ez+25Lm5D3RzPok9bX5PgAqUvoLOnf31Aku7t4vqV/iRBigSe0OvY2Z2VxJym6PV9cSgDq0G/ZBSWuy+2sk/bmadgDUpfB8djN7WtLNkmZLGpP0U0l/kvRHSV+UdFjS99394i/xJlrXZXkYP3/+/GR9//79yfrs2bOT9TLXZi+69vqOHTuS9YceeihZP336dLKeUnQ+e9F26+npSdY/+OCD3NqDDz6YfO3jjz+erJ85cyZZb1Le+eyFn9ndfXVO6dulOgLQUfxcFgiCsANBEHYgCMIOBEHYgSC4lHQFFixYkKyPjIyUWn/R0NvevXtza6tWrUq+9sSJE2311An3339/sv7YY48l66ntVnRa8LXXXpusv/nmm8l6k7iUNBAcYQeCIOxAEIQdCIKwA0EQdiAIwg4EwaWkp4ChoaFk/e67786tdfM4epHBw
cFk/c4770zWr7/++irbmfLYswNBEHYgCMIOBEHYgSAIOxAEYQeCIOxAEIyzd0DR+ehFbrjhhoo6mVrMJjwt+2NF27XMdt+yZUuyftddd7W97qawZweCIOxAEIQdCIKwA0EQdiAIwg4EQdiBIBhnr8A999yTrBddoxwTu+2225L1xYsXJ+up7V7076RonH0qKtyzm9lTZnbczA6OW7bFzI6Z2UvZ3631tgmgrMkcxv9a0vIJlv/K3a/L/nZV2xaAqhWG3d33STrZgV4A1KjMF3T3mdkr2WH+zLwnmVmfmQ2ZWfpCagBq1W7Yt0n6iqTrJI1K+mXeE929392XuPuSNt8LQAXaCru7j7n7WXc/J+kJSUurbQtA1doKu5nNHffwe5IO5j0XQHcoHGc3s6cl3SxptpkdlfRTSTeb2XWSXNIhSWvra7H7FY0HR9bT05NbW7RoUfK1mzZtqrqdj73zzjvJ+pkzZ2p776YUht3dV0+w+MkaegFQI34uCwRB2IEgCDsQBGEHgiDsQBCc4opabd68Obe2bt26Wt/70KFDubU1a9YkX3vkyJGKu2kee3YgCMIOBEHYgSAIOxAEYQeCIOxAEIQdCIJxdpSya1f6WqMLFy7sUCeXGh4ezq09//zzHeykO7BnB4Ig7EAQhB0IgrADQRB2IAjCDgRB2IEgGGevgJkl69Omlft/6ooVK9p+bX9/f7J+xRVXtL1uqfifrcnpqrnE94XYswNBEHYgCMIOBEHYgSAIOxAEYQeCIOxAEIyzV2Dbtm3J+iOPPFJq/Tt37kzWy4xl1z0OXuf6t2/fXtu6L0eFe3Yzu9rM9prZsJm9ZmYbsuWzzOw5M3s9u51Zf7sA2jWZw/iPJP3I3RdJ+oakdWa2SNJGSXvc/RpJe7LHALpUYdjdfdTdX8zuvy9pRNKVklZKGsieNiCpt6YeAVTgE31mN7P5khZL+qekOe4+mpXeljQn5zV9kvpK9AigApP+Nt7MPitph6QH3P1/42vu7pJ8ote5e7+7L3H3JaU6BVDKpMJuZp9WK+i/c/dnssVjZjY3q8+VdLyeFgFUwVo75cQTWudvDkg66e4PjFv+qKR33f1hM9soaZa7/7hgXek3m6LmzZuXrO/fvz9Z7+npSda7+TTSot7GxsZyayMjI8nX9vWlP/2Njo4m66dPn07WL1fuPuE515P5zP5NSXdJetXMXsqWbZL0sKQ/mtkPJR2W9P0K+gRQk8Kwu/vzkvKuzvDtatsBUBd+LgsEQdiBIAg7EARhB4Ig7EAQhePslb7ZZTrOXmTZsmXJem9vb7K+YcOGZL2bx9nXr1+fW9u6dWvV7UD54+zs2YEgCDsQBGEHgiDsQBCEHQiCsANBEHYgCMbZp4Dly5cn66nzvoumLR4cHEzWi6Z8Lpquenh4OLd25MiR5GvRHsbZgeAIOxAEYQeCIOxAEIQdCIKwA0EQdiAIxtmBywzj7EBwhB0IgrADQRB2IAjCDgRB2IEgCDsQRGHYzexqM9trZsNm9pqZbciWbzGzY2b2UvZ3a/3tAmhX4Y9qzGyupLnu/qKZfU7SAUm9as3HfsrdfzHpN+NHNUDt8n5UM5n52UcljWb33zezEUlXVtsegLp9os/sZjZf0mJJ/8wW3Wdmr5jZU2Y2M+c1fWY2ZGZD5VoFUMakfxtvZp+V9HdJP3f3Z8xsjqQTklzSz9Q61L+7YB0cxgM1yzuMn1TYzezTknZK+qu7PzZBfb6kne7+1YL1EHagZm2fCGOty4c+KWlkfNCzL+7O+56kg2WbBFCfyXwbf6Okf0h6VdL5uYE3SVot6Tq1DuMPSVqbfZmXWhd7dqBmpQ7jq0LYgfpxPjsQHGEHgiDsQBCEHQiCsANBEHYgCMIOBEHYgSAIOxAEYQeCIOxAEIQdCIKwA0EQdiCIwgtOVuyEpMPjHs/OlnWjbu2tW/uS6K1dVfY2L6/Q0fPZL3lzsyF3X9JYAwnd2lu39iXRW7s61RuH8UAQhB0Ioumw9zf8/ind2lu39iXRW7s60lujn9kBdE7Te3YAHULYgSAaCbuZLTezf5vZG2a2sYke8pjZITN7NZuGutH56bI59I6b2cFxy2aZ2XNm9np2O+Ecew311hXTeCemGW902zU9/XnHP7Ob2XRJ/5H0HUlHJb0gabW7D3e0kRxmdkjSEndv/AcYZrZM0ilJvzk/tZaZPSLppLs/nP2Pcqa7/6RLetuiTziNd0295U0z/gM1uO2qnP68HU3s2ZdKesPd33L3DyX9QdLKBvroeu6+T9LJixavlDSQ3R9Q6z+WjsvprSu4+6i7v5jdf1/S+WnGG912ib46oomwXynpv+MeH1V3zffuknab2QEz62u6mQnMGTfN1tuS5jTZzAQKp/HupIumGe+abdfO9Odl8QXdpW50969LWiFpXXa42pW89Rmsm8ZOt0n6ilpzAI5K+mWTzWTTjO+Q9IC7/298rcltN0FfHdluTYT9mKSrxz2+KlvWFdz9WHZ7XNKzan3s6CZj52fQzW6PN9zPx9x9zN3Puvs5SU+owW2XTTO+Q9Lv3P2ZbHHj226ivjq13ZoI+wuSrjGzL5nZZyStkjTYQB+XMLMZ2RcnMrMZkr6r7puKelDSmuz+Gkl/brCXC3TLNN5504yr4W3X+PTn7t7xP0m3qvWN/JuSNjfRQ05fX5b0cvb3WtO9SXparcO6M2p9t/FDSZ+XtEfS65L+JmlWF/X2W7Wm9n5FrWDNbai3G9U6RH9F0kvZ361Nb7tEXx3ZbvxcFgiCL+iAIAg7EARhB4Ig7EAQhB0IgrADQRB2IIj/A8nhboC3dEL1AAAAAElFTkSuQmCC\n", "text/plain": [ "
" ] }, - "metadata": {}, + "metadata": { + "needs_background": "light" + }, "output_type": "display_data" }, { @@ -348,12 +356,14 @@ }, { "data": { - "image/png": "iVBORw0KGgoAAAANSUhEUgAAAP8AAAD8CAYAAAC4nHJkAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADl0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uIDIuMi4yLCBodHRwOi8vbWF0cGxvdGxpYi5vcmcvhp/UCwAADXZJREFUeJzt3X+oXPWZx/HPZ00bMQ2SS0ga0uzeGmVdCW6qF1GUqhRjNlZi0UhCWLJaevtHhRb3jxUVKmpBZJvd/mMgxdAIbdqicQ219AcS1xUWyY2EmvZu2xiyTZqQH6ahiQSquU//uOfKNblzZjJzZs7c+7xfIDNznnNmHo753O85c2bm64gQgHz+pu4GANSD8ANJEX4gKcIPJEX4gaQIP5AU4QeSIvxAUoQfSGpWL1/MNh8nBLosItzKeh2N/LZX2v6t7X22H+nkuQD0ltv9bL/tSyT9TtIdkg5J2iVpXUT8pmQbRn6gy3ox8t8gaV9E7I+Iv0j6oaTVHTwfgB7qJPyLJR2c9PhQsexjbA/bHrE90sFrAahYJ2/4TXVoccFhfURslrRZ4rAf6CedjPyHJC2Z9Pgzkg531g6AXukk/LskXWX7s7Y/KWmtpB3VtAWg29o+7I+ID20/JOnnki6RtCUifl1ZZwC6qu1LfW29GOf8QNf15EM+AKYvwg8kRfiBpAg/kBThB5Ii/EBShB9IivADSRF+ICnCDyRF+IGkCD+QFOEHkiL8QFKEH0iK8ANJEX4gKcIPJEX4gaQIP5AU4QeSIvxAUoQfSIrwA0kRfiApwg8kRfiBpAg/kBThB5Jqe4puSbJ9QNJpSeckfRgRQ1U0hY+77rrrSuvbt29vWBscHKy4m/6xYsWK0vro6GjD2sGDB6tuZ9rpKPyF2yPiRAXPA6CHOOwHkuo0/CHpF7Z32x6uoiEAvdHpYf/NEXHY9gJJv7T9fxHxxuQVij8K/GEA+kxHI39EHC5uj0l6WdINU6yzOSKGeDMQ6C9th9/2HNtzJ+5LWiFpb1WNAeiuTg77F0p62fbE8/wgIn5WSVcAuq7t8EfEfkn/WGEvaODOO+8src+ePbtHnfSXu+++u7T+4IMPNqytXbu26namHS71AUkRfiApwg8kRfiBpAg/kBThB5Kq4lt96NCsWeX/G1atWtWjTqaX3bt3l9YffvjhhrU5c+aUbvv++++31dN0wsgPJEX4gaQIP5AU4QeSIvxAUoQfSIrwA0lxnb8P3H777aX1m266qbT+7LPPVtnOtDFv3rzS+jXXXNOwdtlll5Vuy3V+ADMW4QeSIvxAUoQfSIrwA0kRfiApwg8k5Yjo3YvZvXuxPrJs2bLS+uuvv15af++990rr119/fcPamTNnSredzprtt1tuuaVhbdGiRaXbHj9+vJ2W+kJEuJX1GPmBpAg/kBThB5Ii/EBShB9IivADSRF+IKmm3+e3vUXSFyUdi4hlxbIBST+SNCjpgKT7I+JP3Wtzenv88cdL681+Q37lypWl9Zl6LX9gYKC0fuutt5bWx8bGqmxnxmll5P+epPP/9T0i6bWIuErSa8VjANNI0/BHxBuSTp63eLWkrcX9rZLuqbgvAF3W7jn/wog4IknF7YLqWgLQC13/DT/bw5KGu/06AC5OuyP/UduLJKm4PdZoxYjYHBFDETHU5msB6IJ2w79D0obi/gZJr1TTDoBeaRp+29sk/a+kv7d9yPaXJT0j6Q7bv5d0R/EYwDTS9Jw/ItY1KH2h4l6mrfvuu6+0vmrVqtL6vn37SusjIyMX3dNM8Nhjj5XWm13HL/u+/6lTp9ppaUbhE35AUoQfSIrwA0kRfiApwg8kRfiBpJiiuwJr1qwprTebDvq5556rsp1pY3BwsLS+fv360vq5c+dK608//XTD2gcffFC6bQaM/EBShB9IivADSRF+ICnCDyRF+IGkCD+QFNf5W3T55Zc3rN14440dPfemTZs62n66Gh4u/3W3+fPnl9ZHR0dL6zt37rzonjJh5AeSIvxAUoQfSIrwA0kRfiApwg8kRfiBpLjO36LZs2c3rC1evLh0223btlXdzoywdOnSjrbfu3dvRZ3kxMgPJEX4gaQIP5AU4QeSIvxAUoQfSIrwA0k1vc5ve4ukL0o6FhHLimVPSPqKpOPFao9GxE+71WQ/OH36dMPanj17Sre99tprS+sDAwOl9ZMnT5bW+9mCBQsa1ppNbd7Mm2++2dH22bUy8n9P0soplv9HRCwv/pvRwQdmoqbhj4g3JE3foQfAlDo553/I9q9sb7E9r7KOAPREu+HfJGmppOWSjkj6dqMVbQ/bHrE90uZrAeiCtsIfEUcj4lxEjEn6rqQbStbdHBFDETHUbpMAqtdW+G0vmvTwS5L4ehUwzbRyqW+bpNskzbd9SNI3Jd1me7mkkHRA0le72COALmga/ohYN8Xi57vQS187e/Zsw9q7775buu29995bWn/11VdL6xs3biytd9OyZctK61dccUVpfXBwsGEtItpp6SNjY2MdbZ8dn/ADkiL8QFKEH0iK8ANJEX4gKcIPJOVOL7dc1IvZvXuxHrr66qtL608++WRp/a677iqtl/1seLedOHGitN7s30/ZNNu22+ppwty5c0vrZZdnZ7KIaGnHMvIDSRF+ICnCDyRF+IGkCD+QFOEHkiL8QFJc5+8Dy5cvL61feeWVPerkQi+++GJH22/durVhbf369R0996xZzDA/Fa7zAyhF+IGkCD+QFOEHkiL8QFKEH0iK8ANJcaG0DzSb4rtZvZ/t37+/a8/d7GfF9+5lLpkyjPxAUoQfSIrwA0kRfiApwg8kRfiBpAg/kFTT6/y2l0h6QdKnJY1J2hwR37E9IOlHkgYlHZB0f0T8qXutYjoq+23+Tn+3n+v4nWll5P9Q0r9GxD9IulHS12xfI+kRSa9FxFWSXiseA5gmmoY/Io5ExNvF/dOSRiUtlrRa0sTPtGyVdE+3mgRQvYs657c9KOlzkt6StDAijkjjfyAkLai6OQDd0/Jn+21/StJLkr4REX9u9XzN9rCk4fbaA9AtLY38tj+h8eB/PyK2F4uP2l5U1BdJOjbVthGxOSKGImKoioYBVKNp+D0+xD8vaTQiNk4q7ZC0obi/QdIr1bcHoFtaOey/WdI/S3rH9sR3Sx+V9IykH9v+sqQ/SFrTnRYxnZX9NHwvfzYeF2oa/oh4U1KjE/wvVNsOgF7hE35AUoQfSIrwA0kRfiApwg8kRfiBpPjpbnTVpZde2va2Z8+erbATnI+RH0iK8ANJEX4gKcIPJEX4gaQIP5AU4QeS4jo/uuqBBx5oWDt16lTptk899VTV7WASRn4gKcIPJEX4gaQIP5AU4QeSIvxAUoQfSIrr/OiqXbt2Nax
t3LixYU2Sdu7cWXU7mISRH0iK8ANJEX4gKcIPJEX4gaQIP5AU4QeScrM50m0vkfSCpE9LGpO0OSK+Y/sJSV+RdLxY9dGI+GmT52JCdqDLIsKtrNdK+BdJWhQRb9ueK2m3pHsk3S/pTET8e6tNEX6g+1oNf9NP+EXEEUlHivunbY9KWtxZewDqdlHn/LYHJX1O0lvFoods/8r2FtvzGmwzbHvE9khHnQKoVNPD/o9WtD8l6b8lfSsittteKOmEpJD0lMZPDR5s8hwc9gNdVtk5vyTZ/oSkn0j6eURc8G2M4ojgJxGxrMnzEH6gy1oNf9PDftuW9Lyk0cnBL94InPAlSXsvtkkA9Wnl3f5bJP2PpHc0fqlPkh6VtE7Sco0f9h+Q9NXizcGy52LkB7qs0sP+qhB+oPsqO+wHMDMRfiApwg8kRfiBpAg/kBThB5Ii/EBShB9IivADSRF+ICnCDyRF+IGkCD+QFOEHkur1FN0nJP3/pMfzi2X9qF9769e+JHprV5W9/V2rK/b0+/wXvLg9EhFDtTVQol9769e+JHprV129cdgPJEX4gaTqDv/mml+/TL/21q99SfTWrlp6q/WcH0B96h75AdSklvDbXmn7t7b32X6kjh4asX3A9ju299Q9xVgxDdox23snLRuw/Uvbvy9up5wmrabenrD9x2Lf7bG9qqbeltjeaXvU9q9tf71YXuu+K+mrlv3W88N+25dI+p2kOyQdkrRL0rqI+E1PG2nA9gFJQxFR+zVh25+XdEbSCxOzIdl+VtLJiHim+MM5LyL+rU96e0IXOXNzl3prNLP0v6jGfVfljNdVqGPkv0HSvojYHxF/kfRDSatr6KPvRcQbkk6et3i1pK3F/a0a/8fTcw166wsRcSQi3i7un5Y0MbN0rfuupK9a1BH+xZIOTnp8SP015XdI+oXt3baH625mCgsnZkYqbhfU3M/5ms7c3EvnzSzdN/uunRmvq1ZH+KeaTaSfLjncHBHXSfonSV8rDm/Rmk2Slmp8Grcjkr5dZzPFzNIvSfpGRPy5zl4mm6KvWvZbHeE/JGnJpMefkXS4hj6mFBGHi9tjkl7W+GlKPzk6MUlqcXus5n4+EhFHI+JcRIxJ+q5q3HfFzNIvSfp+RGwvFte+76bqq679Vkf4d0m6yvZnbX9S0lpJO2ro4wK25xRvxMj2HEkr1H+zD++QtKG4v0HSKzX28jH9MnNzo5mlVfO+67cZr2v5kE9xKeM/JV0iaUtEfKvnTUzB9hUaH+2l8W88/qDO3mxvk3Sbxr/1dVTSNyX9l6QfS/pbSX+QtCYiev7GW4PebtNFztzcpd4azSz9lmrcd1XOeF1JP3zCD8iJT/gBSRF+ICnCDyRF+IGkCD+QFOEHkiL8QFKEH0jqr8DO4JozFB6IAAAAAElFTkSuQmCC\n", + "image/png": "iVBORw0KGgoAAAANSUhEUgAAAPsAAAD4CAYAAAAq5pAIAAAAOXRFWHRTb2Z0d2FyZQBNYXRwbG90bGliIHZlcnNpb24zLjMuMywgaHR0cHM6Ly9tYXRwbG90bGliLm9yZy/Il7ecAAAACXBIWXMAAAsTAAALEwEAmpwYAAANTUlEQVR4nO3db6hc9Z3H8c9nTRsxDZK7wRDSsKlRkBDcVIMoG1alNGYjEotaEsKSVdnbBxVa3AcrKlTUBZFtln1i4Bal6dJNKRox1LKtDXFdn5TcSFav3m2NIZKEmBhDaCKBavLdB/dErnrnzM3MOXPOzff9gsvMnO+cmS/HfPydPzPzc0QIwMXvL5puAMBgEHYgCcIOJEHYgSQIO5DErEG+mW1O/QM1iwhPtbyvkd32Gtt/sL3P9kP9vBaAernX6+y2L5H0R0nflnRI0m5JGyLinZJ1GNmBmtUxst8gaV9E7I+IP0v6haR1fbwegBr1E/ZFkg5OenyoWPY5todtj9oe7eO9APSp9hN0ETEiaURiNx5oUj8j+2FJiyc9/nqxDEAL9RP23ZKutv0N21+VtF7SjmraAlC1nnfjI+JT2w9I+o2kSyQ9FxFvV9YZgEr1fOmtpzfjmB2oXS0fqgEwcxB2IAnCDiRB2IEkCDuQBGEHkiDsQBKEHUiCsANJEHYgCcIOJEHYgSQIO5AEYQeSIOxAEoQdSIKwA0kQdiAJwg4kQdiBJAg7kARhB5Ig7EAShB1IgrADSRB2IAnCDiRB2IEkCDuQRM9TNmNwrrvuutL69u3bO9aWLFlScTftsXr16tL6+Ph4x9rBgwerbqf1+gq77QOSTkk6K+nTiFhZRVMAqlfFyH5rRByv4HUA1IhjdiCJfsMekn5re4/t4ameYHvY9qjt0T7fC0Af+t2NXxURh21fIekV2/8XEa9NfkJEjEgakSTb0ef7AehRXyN7RBwubo9JelHSDVU0BaB6PYfd9hzbc8/fl7Ra0lhVjQGoVj+78QskvWj7/Ov8Z0T8VyVd4XNuu+220vrs2bMH1Em73HHHHaX1++67r2Nt/fr1VbfTej2HPSL2S/rrCnsBUCMuvQFJEHYgCcIOJEHYgSQIO5AEX3FtgVmzyv8zrF27dkCdzCx79uwprT/44IMda3PmzCld9+OPP+6ppzZjZAeSIOxAEoQdSIKwA0kQdiAJwg4kQdiBJLjO3gK33npraf2mm24qrT/99NNVtjNjzJs3r7S+bNmyjrXLLrusdF2uswOYsQg7kARhB5Ig7EAShB1IgrADSRB2IAlHDG6Slqwzwixfvry0/uqrr5bWP/roo9L69ddf37F2+vTp0nVnsm7bbdWqVR1rCxcuLF33ww8/7KWlVogIT7WckR1IgrADSRB2IAnCDiRB2IEkCDuQBGEHkuD77APw6KOPlta7/Yb5mjVrSusX67X0oaGh0vrNN99cWj937lyV7cx4XUd228/ZPmZ7bNKyIduv2H63uC3/FQEAjZvObvxPJX1xaHlI0s6IuFrSzuIxgBbrGvaIeE3SiS8sXidpa3F/q6Q7q20LQNV6PWZfEBFHivsfSFrQ6Ym2hyUN9/g+ACrS9wm6iIiyL7hExIikESnvF2GANuj10ttR2wslqbg9Vl1LAOrQa9h3SNpU3N8k6aVq2gFQl6678ba3SbpF0nzbhyT9SNJTkn5p+35J70v6bp1Ntt3dd99dWu82v/q+fftK66Ojoxfc08XgkUceKa13u45e9n33kydP9tDRzNY17BGxoUPpWxX3AqBGfFwWSIKwA0kQdiAJwg4kQdiBJPiKawXuueee0nq36YGfeeaZKtuZMZYsWVJa37hxY2n97NmzpfUnn3yyY+2TTz4pXfdixMgOJEHYgSQIO5AEYQeSIOxAEoQdSIKwA0lwnX2aLr/88o61G2+8sa/X3rJlS1/rz1TDw+W/VjZ//vzS+vj4eGl9165dF9zTxYyRHUiCsANJEHYgCcIOJEHYgSQIO5AEYQeS4Dr7NM2ePbtjbdGiRaXrbtu2rep2LgpLly7ta/2xsbHuT8JnGNmBJAg7kARhB5Ig7EAShB1IgrADSRB2IAmus0/TqVOnOtb27t1buu61115bWh8aG
iqtnzhxorTeZldccUXHWreprrt5/fXX+1o/m64ju+3nbB+zPTZp2WO2D9veW/yVT0AOoHHT2Y3/qaQ1Uyz/t4hYUfz9utq2AFSta9gj4jVJM3c/EoCk/k7QPWD7zWI3f16nJ9ketj1qe7SP9wLQp17DvkXSUkkrJB2R9ONOT4yIkYhYGREre3wvABXoKewRcTQizkbEOUk/kXRDtW0BqFpPYbe9cNLD70jiu4ZAy3W9zm57m6RbJM23fUjSjyTdYnuFpJB0QNL36muxHc6cOdOx9t5775Wue9ddd5XWX3755dL65s2bS+t1Wr58eWn9yiuvLK2XzcEeEb209Jlz5871tX42XcMeERumWPxsDb0AqBEflwWSIOxAEoQdSIKwA0kQdiAJ93v544LezB7cmw3QNddcU1p//PHHS+u33357ab3sZ6zrdvz48dJ6t38/ZdMu2+6pp/Pmzp1bWi+7XHoxi4gpNywjO5AEYQeSIOxAEoQdSIKwA0kQdiAJwg4kwXX2FlixYkVp/aqrrhpMI1N4/vnn+1p/69atHWsbN27s67VnzeKX0KfCdXYgOcIOJEHYgSQIO5AEYQeSIOxAEoQdSIILlS3QbcrnbvU2279/f22v3e1nrsfGmM5gMkZ2IAnCDiRB2IEkCDuQBGEHkiDsQBKEHUiC6+yoVdlvw/f7u/FcR78wXUd224tt77L9ju23bf+gWD5k+xXb7xa38+pvF0CvprMb/6mkf4qIZZJulPR928skPSRpZ0RcLWln8RhAS3UNe0QciYg3ivunJI1LWiRpnaTzvzm0VdKdNfUIoAIXdMxue4mkb0r6vaQFEXGkKH0gaUGHdYYlDffRI4AKTPtsvO2vSXpB0g8j4k+TazHxq5VT/phkRIxExMqIWNlXpwD6Mq2w2/6KJoL+84jYXiw+anthUV8o6Vg9LQKownTOxlvSs5LGI2LzpNIOSZuK+5skvVR9e5jpIqK2P1yY6Ryz/42kv5f0lu29xbKHJT0l6Ze275f0vqTv1tIhgEp0DXtEvC6p06cfvlVtOwDqwsdlgSQIO5AEYQeSIOxAEoQdSIKvuKJWl156ac/rnjlzpsJOwMgOJEHYgSQIO5AEYQeSIOxAEoQdSIKwA0lwnR21uvfeezvWTp48WbruE088UXE3uTGyA0kQdiAJwg4kQdiBJAg7kARhB5Ig7EASXGdHrXbv3t2xtnnz5o41Sdq1a1fV7aTGyA4kQdiBJAg7kARhB5Ig7EAShB1IgrADSbjbPNe2F0v6maQFkkLSSET8u+3HJP2jpA+Lpz4cEb/u8lpMqg3ULCKmnHV5OmFfKGlhRLxhe66kPZLu1MR87Kcj4l+n2wRhB+rXKezTmZ/9iKQjxf1TtsclLaq2PQB1u6BjdttLJH1T0u+LRQ/YftP2c7bndVhn2Pao7dH+WgXQj6678Z890f6apP+W9C8Rsd32AknHNXEc/4QmdvXv6/Ia7MYDNev5mF2SbH9F0q8k/SYivvTthWLE/1VELO/yOoQdqFmnsHfdjbdtSc9KGp8c9OLE3XnfkTTWb5MA6jOds/GrJP2PpLcknSsWPyxpg6QVmtiNPyDpe8XJvLLXYmQHatbXbnxVCDtQv5534wFcHAg7kARhB5Ig7EAShB1IgrADSRB2IAnCDiRB2IEkCDuQBGEHkiDsQBKEHUiCsANJDHrK5uOS3p/0eH6xrI3a2ltb+5LorVdV9vZXnQoD/T77l97cHo2IlY01UKKtvbW1L4neejWo3tiNB5Ig7EASTYd9pOH3L9PW3tral0RvvRpIb40eswMYnKZHdgADQtiBJBoJu+01tv9ge5/th5rooRPbB2y/ZXtv0/PTFXPoHbM9NmnZkO1XbL9b3E45x15DvT1m+3Cx7fbaXttQb4tt77L9ju23bf+gWN7otivpayDbbeDH7LYvkfRHSd+WdEjSbkkbIuKdgTbSge0DklZGROMfwLD9t5JOS/rZ+am1bD8t6UREPFX8j3JeRPxzS3p7TBc4jXdNvXWaZvwf1OC2q3L68140MbLfIGlfROyPiD9L+oWkdQ300XoR8ZqkE19YvE7S1uL+Vk38Yxm4Dr21QkQciYg3ivunJJ2fZrzRbVfS10A0EfZFkg5OenxI7ZrvPST91vYe28NNNzOFBZOm2fpA0oImm5lC12m8B+kL04y3Ztv1Mv15vzhB92WrIuI6SX8n6fvF7morxcQxWJuunW6RtFQTcwAekfTjJpspphl/QdIPI+JPk2tNbrsp+hrIdmsi7IclLZ70+OvFslaIiMPF7TFJL2risKNNjp6fQbe4PdZwP5+JiKMRcTYizkn6iRrcdsU04y9I+nlEbC8WN77tpuprUNutibDvlnS17W/Y/qqk9ZJ2NNDHl9ieU5w4ke05klarfVNR75C0qbi/SdJLDfbyOW2ZxrvTNONqeNs1Pv15RAz8T9JaTZyRf0/SI0300KGvKyX9b/H3dtO9Sdqmid26TzRxbuN+SX8paaekdyX9TtJQi3r7D01M7f2mJoK1sKHeVmliF/1NSXuLv7VNb7uSvgay3fi4LJAEJ+iAJAg7kARhB5Ig7EAShB1IgrADSRB2IIn/BwSyThmzraIZAAAAAElFTkSuQmCC\n", "text/plain": [ "
" ] }, - "metadata": {}, + "metadata": { + "needs_background": "light" + }, "output_type": "display_data" }, { @@ -376,25 +386,32 @@ " plt.show()\n", " print(\"Model prediction: %i\" % np.argmax(predictions.numpy()[i]))" ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [] } ], "metadata": { "kernelspec": { - "display_name": "Python 2", + "display_name": "Python 3", "language": "python", - "name": "python2" + "name": "python3" }, "language_info": { "codemirror_mode": { "name": "ipython", - "version": 2 + "version": 3 }, "file_extension": ".py", "mimetype": "text/x-python", "name": "python", "nbconvert_exporter": "python", - "pygments_lexer": "ipython2", - "version": "2.7.15" + "pygments_lexer": "ipython3", + "version": "3.8.0" } }, "nbformat": 4, diff --git a/tensorflow_v2/notebooks/4_Utils/tensorboard.ipynb b/tensorflow_v2/notebooks/4_Utils/tensorboard.ipynb new file mode 100644 index 00000000..b552d0e2 --- /dev/null +++ b/tensorflow_v2/notebooks/4_Utils/tensorboard.ipynb @@ -0,0 +1,350 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Tensorboard\n", + "Graph, Loss, Accuracy & Weights visualization using Tensorboard and TensorFlow v2. This example is using the MNIST database of handwritten digits (http://yann.lecun.com/exdb/mnist/).\n", + "\n", + "- Author: Aymeric Damien\n", + "- Project: https://github.com/aymericdamien/TensorFlow-Examples/" + ] + }, + { + "cell_type": "code", + "execution_count": 1, + "metadata": {}, + "outputs": [], + "source": [ + "from __future__ import absolute_import, division, print_function\n", + "\n", + "import tensorflow as tf\n", + "import numpy as np" + ] + }, + { + "cell_type": "code", + "execution_count": 2, + "metadata": {}, + "outputs": [], + "source": [ + "# Path to save logs into.\n", + "logs_path = '/tmp/tensorflow_logs/example/'\n", + "\n", + "# MNIST dataset parameters.\n", + "num_classes = 10 # total classes (0-9 digits).\n", + "num_features = 784 # data features (img shape: 28*28).\n", + "\n", + "# Training parameters.\n", + "learning_rate = 0.001\n", + "training_steps = 3000\n", + "batch_size = 256\n", + "display_step = 100\n", + "\n", + "# Network parameters.\n", + "n_hidden_1 = 128 # 1st layer number of neurons.\n", + "n_hidden_2 = 256 # 2nd layer number of neurons." + ] + }, + { + "cell_type": "code", + "execution_count": 3, + "metadata": {}, + "outputs": [], + "source": [ + "# Prepare MNIST data.\n", + "from tensorflow.keras.datasets import mnist\n", + "(x_train, y_train), (x_test, y_test) = mnist.load_data()\n", + "# Convert to float32.\n", + "x_train, x_test = np.array(x_train, np.float32), np.array(x_test, np.float32)\n", + "# Flatten images to 1-D vector of 784 features (28*28).\n", + "x_train, x_test = x_train.reshape([-1, num_features]), x_test.reshape([-1, num_features])\n", + "# Normalize images value from [0, 255] to [0, 1].\n", + "x_train, x_test = x_train / 255., x_test / 255." 
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": 4,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "# Use tf.data API to shuffle and batch data.\n",
+    "train_data = tf.data.Dataset.from_tensor_slices((x_train, y_train))\n",
+    "train_data = train_data.repeat().shuffle(5000).batch(batch_size).prefetch(1)"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": 5,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "# Store the layers' weights & biases.\n",
+    "\n",
+    "# A random value generator to initialize weights.\n",
+    "random_normal = tf.initializers.RandomNormal()\n",
+    "\n",
+    "weights = {\n",
+    "    'h1_weights': tf.Variable(random_normal([num_features, n_hidden_1]), name='h1_weights'),\n",
+    "    'h2_weights': tf.Variable(random_normal([n_hidden_1, n_hidden_2]), name='h2_weights'),\n",
+    "    'logits_weights': tf.Variable(random_normal([n_hidden_2, num_classes]), name='logits_weights')\n",
+    "}\n",
+    "biases = {\n",
+    "    'h1_bias': tf.Variable(tf.zeros([n_hidden_1]), name='h1_bias'),\n",
+    "    'h2_bias': tf.Variable(tf.zeros([n_hidden_2]), name='h2_bias'),\n",
+    "    'logits_bias': tf.Variable(tf.zeros([num_classes]), name='logits_bias')\n",
+    "}"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": 6,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "# Construct the model, encapsulating all ops into scopes to make\n",
+    "# Tensorboard's graph visualization more convenient.\n",
+    "\n",
+    "# The computation graph to be traced.\n",
+    "@tf.function\n",
+    "def neural_net(x):\n",
+    "    with tf.name_scope('Model'):\n",
+    "        with tf.name_scope('HiddenLayer1'):\n",
+    "            # Hidden fully connected layer with 128 neurons.\n",
+    "            layer_1 = tf.add(tf.matmul(x, weights['h1_weights']), biases['h1_bias'])\n",
+    "            # Apply sigmoid to layer_1 output for non-linearity.\n",
+    "            layer_1 = tf.nn.sigmoid(layer_1)\n",
+    "        with tf.name_scope('HiddenLayer2'):\n",
+    "            # Hidden fully connected layer with 256 neurons.\n",
+    "            layer_2 = tf.add(tf.matmul(layer_1, weights['h2_weights']), biases['h2_bias'])\n",
+    "            # Apply sigmoid to layer_2 output for non-linearity.\n",
+    "            layer_2 = tf.nn.sigmoid(layer_2)\n",
+    "        with tf.name_scope('LogitsLayer'):\n",
+    "            # Output fully connected layer with a neuron for each class.\n",
+    "            out_layer = tf.matmul(layer_2, weights['logits_weights']) + biases['logits_bias']\n",
+    "            # Apply softmax to normalize the logits to a probability distribution.\n",
+    "            out_layer = tf.nn.softmax(out_layer)\n",
+    "    return out_layer"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": 7,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "# Cross-Entropy loss function.\n",
+    "def cross_entropy(y_pred, y_true):\n",
+    "    with tf.name_scope('CrossEntropyLoss'):\n",
+    "        # Encode label to a one-hot vector.\n",
+    "        y_true = tf.one_hot(y_true, depth=num_classes)\n",
+    "        # Clip prediction values to avoid log(0) error.\n",
+    "        y_pred = tf.clip_by_value(y_pred, 1e-9, 1.)\n",
+    "        # Compute cross-entropy (note: summed over the batch, not averaged).\n",
+    "        return tf.reduce_mean(-tf.reduce_sum(y_true * tf.math.log(y_pred)))\n",
+    "\n",
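For reference, the clipped formula in `cross_entropy` above matches Keras' built-in categorical cross-entropy up to the reduction: `reduce_sum` with no axis sums the per-example losses over the whole batch (the trailing `reduce_mean` then acts on a scalar). A sketch of the equivalent using the built-in loss — an illustration, not the diff's code, and the internal clipping constants differ slightly:

```python
import tensorflow as tf

num_classes = 10  # as defined in the notebook above.

# Equivalent of the notebook's cross_entropy via the built-in loss:
# categorical_crossentropy returns one value per example, so summing
# over the batch matches the reduce_sum-based formula above.
def cross_entropy_builtin(y_pred, y_true):
    y_true = tf.one_hot(y_true, depth=num_classes)
    per_example = tf.keras.losses.categorical_crossentropy(y_true, y_pred)
    return tf.reduce_sum(per_example)
```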
+    "# Accuracy metric.\n",
+    "def accuracy(y_pred, y_true):\n",
+    "    with tf.name_scope('Accuracy'):\n",
+    "        # Predicted class is the index of the highest score in the prediction vector (i.e. argmax).\n",
+    "        correct_prediction = tf.equal(tf.argmax(y_pred, 1), tf.cast(y_true, tf.int64))\n",
+    "        return tf.reduce_mean(tf.cast(correct_prediction, tf.float32), axis=-1)\n",
+    "\n",
+    "# Stochastic gradient descent optimizer.\n",
+    "with tf.name_scope('Optimizer'):\n",
+    "    optimizer = tf.optimizers.SGD(learning_rate)"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": 8,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "# Optimization process.\n",
+    "def run_optimization(x, y):\n",
+    "    # Wrap computation inside a GradientTape for automatic differentiation.\n",
+    "    with tf.GradientTape() as g:\n",
+    "        pred = neural_net(x)\n",
+    "        loss = cross_entropy(pred, y)\n",
+    "\n",
+    "    # Variables to update, i.e. trainable variables. Wrap the dict views\n",
+    "    # in list() so the concatenation also works on Python 3.\n",
+    "    trainable_variables = list(weights.values()) + list(biases.values())\n",
+    "\n",
+    "    # Compute gradients.\n",
+    "    gradients = g.gradient(loss, trainable_variables)\n",
+    "\n",
+    "    # Update weights/biases following gradients.\n",
+    "    optimizer.apply_gradients(zip(gradients, trainable_variables))"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": 9,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "# Visualize weights & biases as histograms in Tensorboard.\n",
+    "def summarize_weights(step):\n",
+    "    for w in weights:\n",
+    "        tf.summary.histogram(w.replace('_', '/'), weights[w], step=step)\n",
+    "    for b in biases:\n",
+    "        tf.summary.histogram(b.replace('_', '/'), biases[b], step=step)"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": 10,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "# Create a Summary Writer to log the metrics to Tensorboard.\n",
+    "summary_writer = tf.summary.create_file_writer(logs_path)"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": 11,
+   "metadata": {
+    "scrolled": false
+   },
+   "outputs": [
+    {
+     "name": "stdout",
+     "output_type": "stream",
+     "text": [
+      "step: 100, loss: 568.735596, accuracy: 0.140625\n",
+      "step: 200, loss: 413.169342, accuracy: 0.535156\n",
+      "step: 300, loss: 250.977036, accuracy: 0.714844\n",
+      "step: 400, loss: 173.749298, accuracy: 0.800781\n",
+      "step: 500, loss: 156.936569, accuracy: 0.839844\n",
+      "step: 600, loss: 137.818451, accuracy: 0.847656\n",
+      "step: 700, loss: 93.407814, accuracy: 0.929688\n",
+      "step: 800, loss: 90.832336, accuracy: 0.906250\n",
+      "step: 900, loss: 86.932831, accuracy: 0.914062\n",
+      "step: 1000, loss: 78.824707, accuracy: 0.906250\n",
+      "step: 1100, loss: 94.388290, accuracy: 0.902344\n",
+      "step: 1200, loss: 96.240608, accuracy: 0.894531\n",
+      "step: 1300, loss: 96.657593, accuracy: 0.898438\n",
+      "step: 1400, loss: 71.909309, accuracy: 0.914062\n",
+      "step: 1500, loss: 67.343407, accuracy: 0.941406\n",
+      "step: 1600, loss: 63.693596, accuracy: 0.941406\n",
+      "step: 1700, loss: 60.081478, accuracy: 0.914062\n",
+      "step: 1800, loss: 63.764942, accuracy: 0.921875\n",
+      "step: 1900, loss: 58.722507, accuracy: 0.921875\n",
+      "step: 2000, loss: 66.727455, accuracy: 0.917969\n",
+      "step: 2100, loss: 70.566788, accuracy: 0.949219\n",
+      "step: 2200, loss: 64.642334, accuracy: 0.925781\n",
+      "step: 2300, loss: 54.872856, accuracy: 0.941406\n",
+      "step: 2400, loss: 64.342377, accuracy: 0.925781\n",
+      "step: 2500, loss: 74.306488, accuracy: 0.921875\n",
+      "step: 2600, loss: 40.165890, accuracy: 0.949219\n",
+      "step: 2700, loss: 64.992249, accuracy: 0.925781\n",
+      "step: 2800, loss: 43.422794, accuracy: 0.957031\n",
+      "step: 2900, loss: 46.625320, accuracy: 0.937500\n",
+      "step: 3000, loss: 62.517433, accuracy: 0.914062\n"
+     ]
+    }
+   ],
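A note on the scale of the loss values in the log above: because `cross_entropy` sums rather than averages the per-example losses, the initial value of roughly 568.7 with `batch_size = 256` works out to about 568.7 / 256 ≈ 2.22 per example, close to the ln(10) ≈ 2.30 expected when a 10-class softmax model is still guessing uniformly.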
+   "source": [
+    "# Run training for the given number of steps.\n",
+    "for step, (batch_x, batch_y) in enumerate(train_data.take(training_steps), 1):\n",
+    "\n",
+    "    # Start to trace the computation graph. The computation graph remains\n",
+    "    # the same at each step, so we just need to export it once.\n",
+    "    if step == 1:\n",
+    "        tf.summary.trace_on(graph=True, profiler=True)\n",
+    "\n",
+    "    # Run the optimization (computation graph).\n",
+    "    run_optimization(batch_x, batch_y)\n",
+    "\n",
+    "    # Export the computation graph to Tensorboard after the first\n",
+    "    # computation step has been performed.\n",
+    "    if step == 1:\n",
+    "        with summary_writer.as_default():\n",
+    "            tf.summary.trace_export(\n",
+    "                name=\"trace\",\n",
+    "                step=0,\n",
+    "                profiler_outdir=logs_path)\n",
+    "\n",
+    "    if step % display_step == 0:\n",
+    "        pred = neural_net(batch_x)\n",
+    "        loss = cross_entropy(pred, batch_y)\n",
+    "        acc = accuracy(pred, batch_y)\n",
+    "        print(\"step: %i, loss: %f, accuracy: %f\" % (step, loss, acc))\n",
+    "\n",
+    "        # Write loss/acc metrics & weights to Tensorboard every few steps,\n",
+    "        # to avoid storing too much data.\n",
+    "        with summary_writer.as_default():\n",
+    "            tf.summary.scalar('loss', loss, step=step)\n",
+    "            tf.summary.scalar('accuracy', acc, step=step)\n",
+    "            summarize_weights(step)\n"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "### Run Tensorboard\n",
+    "\n",
+    "To launch Tensorboard, run the following command in your terminal:\n",
+    "```\n",
+    "tensorboard --logdir=/tmp/tensorflow_logs\n",
+    "```\n",
+    "\n",
+    "Then connect your web browser to: [http://localhost:6006](http://localhost:6006)\n"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "![tensorboard1](../../../resources/img/tf2/tensorboard1.png)"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "![tensorboard2](../../../resources/img/tf2/tensorboard2.png)"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "![tensorboard3](../../../resources/img/tf2/tensorboard3.png)"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "![tensorboard4](../../../resources/img/tf2/tensorboard4.png)"
+   ]
+  }
+ ],
+ "metadata": {
+  "kernelspec": {
+   "display_name": "Python 2",
+   "language": "python",
+   "name": "python2"
+  },
+  "language_info": {
+   "codemirror_mode": {
+    "name": "ipython",
+    "version": 2
+   },
+   "file_extension": ".py",
+   "mimetype": "text/x-python",
+   "name": "python",
+   "nbconvert_exporter": "python",
+   "pygments_lexer": "ipython2",
+   "version": "2.7.18"
+  }
+ },
+ "nbformat": 4,
+ "nbformat_minor": 2
+}
diff --git a/tensorflow_v2/notebooks/6_Hardware/multigpu_training.ipynb b/tensorflow_v2/notebooks/6_Hardware/multigpu_training.ipynb
new file mode 100644
index 00000000..46b07000
--- /dev/null
+++ b/tensorflow_v2/notebooks/6_Hardware/multigpu_training.ipynb
@@ -0,0 +1,371 @@
+{
+ "cells": [
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "# Multi-GPU Training Example\n",
+    "\n",
+    "Train a convolutional neural network on multiple GPUs with TensorFlow 2.0+.\n",
+    "\n",
+    "- Author: Aymeric Damien\n",
+    "- Project: https://github.com/aymericdamien/TensorFlow-Examples/"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "## Training with multiple GPU cards\n",
+    "\n",
+    "In this example, we use data parallelism to split the training across multiple GPUs. Each GPU holds a full replica of the neural network model, and the weights (i.e. variables) are updated synchronously by waiting for each GPU to process its batch of data.\n",
+    "\n",
+    "First, each GPU processes a distinct batch of data and computes the corresponding gradients; then all the gradients are accumulated on the CPU and averaged. The model weights are finally updated with the averaged gradients, and the new weights are sent back to each GPU, repeating the training process.\n",
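As context for the manual tower loop this notebook builds: TF 2 also ships the same synchronous data-parallel scheme as a built-in, `tf.distribute.MirroredStrategy`, which replicates the model on each visible GPU and all-reduces the gradients automatically. A minimal sketch with a hypothetical toy model — an alternative, not the notebook's own ConvNet or method:

```python
import tensorflow as tf

# Synchronous data parallelism via MirroredStrategy: one model replica
# per visible GPU, gradients averaged (all-reduced) across replicas.
strategy = tf.distribute.MirroredStrategy()
with strategy.scope():
    # Any Keras model built inside the scope is mirrored across GPUs.
    model = tf.keras.Sequential([
        tf.keras.layers.Conv2D(64, 3, activation='relu', input_shape=(32, 32, 3)),
        tf.keras.layers.Flatten(),
        tf.keras.layers.Dense(10),
    ])
    model.compile(
        optimizer=tf.keras.optimizers.Adam(0.001),
        loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
        metrics=['accuracy'])

# model.fit(x_train, y_train, batch_size=4096, epochs=5)  # usage sketch
```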
+    "\n",
+    "\"Parallelism\"\n",
+    "\n",
+    "## CIFAR10 Dataset Overview\n",
+    "\n",
+    "The CIFAR-10 dataset consists of 60000 32x32 colour images in 10 classes, with 6000 images per class. There are 50000 training images and 10000 test images.\n",
+    "\n",
+    "![CIFAR10 Dataset](https://storage.googleapis.com/kaggle-competitions/kaggle/3649/media/cifar-10.png)\n",
+    "\n",
+    "More info: https://www.cs.toronto.edu/~kriz/cifar.html"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": 1,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "from __future__ import absolute_import, division, print_function\n",
+    "\n",
+    "import tensorflow as tf\n",
+    "from tensorflow.keras import Model, layers\n",
+    "import time\n",
+    "import numpy as np"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": 2,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "# CIFAR-10 dataset parameters.\n",
+    "num_classes = 10 # total number of CIFAR-10 classes.\n",
+    "num_gpus = 4\n",
+    "\n",
+    "# Training parameters.\n",
+    "learning_rate = 0.001\n",
+    "training_steps = 1000\n",
+    "# Split batch size equally between GPUs.\n",
+    "# Note: Reduce batch size if you encounter OOM Errors.\n",
+    "batch_size = 1024 * num_gpus\n",
+    "display_step = 20\n",
+    "\n",
+    "# Network parameters.\n",
+    "conv1_filters = 64 # number of filters for 1st conv layer.\n",
+    "conv2_filters = 128 # number of filters for 2nd conv layer.\n",
+    "conv3_filters = 256 # number of filters for 3rd conv layer.\n",
+    "fc1_units = 2048 # number of neurons for 1st fully-connected layer."
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": 3,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "# Prepare CIFAR-10 data.\n",
+    "from tensorflow.keras.datasets import cifar10\n",
+    "(x_train, y_train), (x_test, y_test) = cifar10.load_data()\n",
+    "# Convert to float32.\n",
+    "x_train, x_test = np.array(x_train, np.float32), np.array(x_test, np.float32)\n",
+    "# Normalize images value from [0, 255] to [0, 1].\n",
+    "x_train, x_test = x_train / 255., x_test / 255.\n",
+    "y_train, y_test = np.reshape(y_train, (-1)), np.reshape(y_test, (-1))"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": 4,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "# Use tf.data API to shuffle and batch data.\n",
+    "train_data = tf.data.Dataset.from_tensor_slices((x_train, y_train))\n",
+    "train_data = train_data.repeat().shuffle(batch_size * 10).batch(batch_size).prefetch(num_gpus)"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": 5,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "class ConvNet(Model):\n",
+    "    # Set layers.\n",
+    "    def __init__(self):\n",
+    "        super(ConvNet, self).__init__()\n",
+    "\n",
+    "        # Convolution Layer with 64 filters and a kernel size of 3.\n",
+    "        self.conv1_1 = layers.Conv2D(conv1_filters, kernel_size=3, padding='SAME', activation=tf.nn.relu)\n",
+    "        self.conv1_2 = layers.Conv2D(conv1_filters, kernel_size=3, padding='SAME', activation=tf.nn.relu)\n",
+    "        # Max Pooling (down-sampling) with kernel size of 2 and strides of 2.\n",
+  {
+   "cell_type": "code",
+   "execution_count": 1,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "from __future__ import absolute_import, division, print_function\n",
+    "\n",
+    "import tensorflow as tf\n",
+    "from tensorflow.keras import Model, layers\n",
+    "import time\n",
+    "import numpy as np"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": 2,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "# CIFAR-10 dataset parameters.\n",
+    "num_classes = 10 # total number of CIFAR-10 classes.\n",
+    "num_gpus = 4\n",
+    "\n",
+    "# Training parameters.\n",
+    "learning_rate = 0.001\n",
+    "training_steps = 1000\n",
+    "# Split batch size equally between GPUs.\n",
+    "# Note: Reduce batch size if you encounter OOM Errors.\n",
+    "batch_size = 1024 * num_gpus\n",
+    "display_step = 20\n",
+    "\n",
+    "# Network parameters.\n",
+    "conv1_filters = 64 # number of filters for 1st conv layer.\n",
+    "conv2_filters = 128 # number of filters for 2nd conv layer.\n",
+    "conv3_filters = 256 # number of filters for 3rd conv layer.\n",
+    "fc1_units = 1024 # number of neurons for 1st fully-connected layer."
+   ]
+  },
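+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "`num_gpus` is hard-coded to 4 here, and should match the number of GPUs actually visible to TensorFlow. A quick sanity check (a sketch, assuming TF 2.1+ where `tf.config.list_physical_devices` is available; on TF 2.0 use `tf.config.experimental.list_physical_devices`):\n",
+    "```python\n",
+    "assert len(tf.config.list_physical_devices('GPU')) >= num_gpus\n",
+    "```\n"
+   ]
+  },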
+  {
+   "cell_type": "code",
+   "execution_count": 3,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "# Prepare CIFAR-10 data.\n",
+    "from tensorflow.keras.datasets import cifar10\n",
+    "(x_train, y_train), (x_test, y_test) = cifar10.load_data()\n",
+    "# Convert to float32.\n",
+    "x_train, x_test = np.array(x_train, np.float32), np.array(x_test, np.float32)\n",
+    "# Normalize image values from [0, 255] to [0, 1].\n",
+    "x_train, x_test = x_train / 255., x_test / 255.\n",
+    "y_train, y_test = np.reshape(y_train, (-1)), np.reshape(y_test, (-1))"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": 4,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "# Use tf.data API to shuffle and batch data.\n",
+    "train_data = tf.data.Dataset.from_tensor_slices((x_train, y_train))\n",
+    "train_data = train_data.repeat().shuffle(batch_size * 10).batch(batch_size).prefetch(num_gpus)"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": 5,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "class ConvNet(Model):\n",
+    "    # Set layers.\n",
+    "    def __init__(self):\n",
+    "        super(ConvNet, self).__init__()\n",
+    "\n",
+    "        # Convolution Layer with 64 filters and a kernel size of 3.\n",
+    "        self.conv1_1 = layers.Conv2D(conv1_filters, kernel_size=3, padding='SAME', activation=tf.nn.relu)\n",
+    "        self.conv1_2 = layers.Conv2D(conv1_filters, kernel_size=3, padding='SAME', activation=tf.nn.relu)\n",
+    "        # Max Pooling (down-sampling) with kernel size of 2 and strides of 2.\n",
+    "        self.maxpool1 = layers.MaxPool2D(2, strides=2)\n",
+    "\n",
+    "        # Convolution Layer with 128 filters and a kernel size of 3.\n",
+    "        self.conv2_1 = layers.Conv2D(conv2_filters, kernel_size=3, padding='SAME', activation=tf.nn.relu)\n",
+    "        self.conv2_2 = layers.Conv2D(conv2_filters, kernel_size=3, padding='SAME', activation=tf.nn.relu)\n",
+    "        self.conv2_3 = layers.Conv2D(conv2_filters, kernel_size=3, padding='SAME', activation=tf.nn.relu)\n",
+    "        # Max Pooling (down-sampling) with kernel size of 2 and strides of 2.\n",
+    "        self.maxpool2 = layers.MaxPool2D(2, strides=2)\n",
+    "\n",
+    "        # Convolution Layer with 256 filters and a kernel size of 3.\n",
+    "        self.conv3_1 = layers.Conv2D(conv3_filters, kernel_size=3, padding='SAME', activation=tf.nn.relu)\n",
+    "        self.conv3_2 = layers.Conv2D(conv3_filters, kernel_size=3, padding='SAME', activation=tf.nn.relu)\n",
+    "        self.conv3_3 = layers.Conv2D(conv3_filters, kernel_size=3, padding='SAME', activation=tf.nn.relu)\n",
+    "\n",
+    "        # Flatten the data to a 1-D vector for the fully connected layer.\n",
+    "        self.flatten = layers.Flatten()\n",
+    "\n",
+    "        # Fully connected layer.\n",
+    "        self.fc1 = layers.Dense(fc1_units, activation=tf.nn.relu)\n",
+    "        # Apply Dropout (if is_training is False, dropout is not applied).\n",
+    "        self.dropout = layers.Dropout(rate=0.5)\n",
+    "\n",
+    "        # Output layer, class prediction.\n",
+    "        self.out = layers.Dense(num_classes)\n",
+    "\n",
+    "    # Set forward pass.\n",
+    "    @tf.function\n",
+    "    def call(self, x, is_training=False):\n",
+    "        x = self.conv1_1(x)\n",
+    "        x = self.conv1_2(x)\n",
+    "        x = self.maxpool1(x)\n",
+    "        x = self.conv2_1(x)\n",
+    "        x = self.conv2_2(x)\n",
+    "        x = self.conv2_3(x)\n",
+    "        x = self.maxpool2(x)\n",
+    "        x = self.conv3_1(x)\n",
+    "        x = self.conv3_2(x)\n",
+    "        x = self.conv3_3(x)\n",
+    "        x = self.flatten(x)\n",
+    "        x = self.fc1(x)\n",
+    "        x = self.dropout(x, training=is_training)\n",
+    "        x = self.out(x)\n",
+    "        if not is_training:\n",
+    "            # tf cross entropy expects logits without softmax, so only\n",
+    "            # apply softmax when not training.\n",
+    "            x = tf.nn.softmax(x)\n",
+    "        return x"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": 6,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "# Cross-Entropy Loss.\n",
+    "# Note that this will apply 'softmax' to the logits.\n",
+    "@tf.function\n",
+    "def cross_entropy_loss(x, y):\n",
+    "    # Convert labels to int64 for tf cross-entropy function.\n",
+    "    y = tf.cast(y, tf.int64)\n",
+    "    # Apply softmax to logits and compute cross-entropy.\n",
+    "    loss = tf.nn.sparse_softmax_cross_entropy_with_logits(labels=y, logits=x)\n",
+    "    # Average loss across the batch.\n",
+    "    return tf.reduce_mean(loss)\n",
+    "\n",
+    "# Accuracy metric.\n",
+    "@tf.function\n",
+    "def accuracy(y_pred, y_true):\n",
+    "    # Predicted class is the index of the highest score in the prediction vector (i.e. argmax).\n",
+    "    correct_prediction = tf.equal(tf.argmax(y_pred, 1), tf.cast(y_true, tf.int64))\n",
+    "    return tf.reduce_mean(tf.cast(correct_prediction, tf.float32), axis=-1)\n",
+    "\n",
+    "@tf.function\n",
+    "def backprop(batch_x, batch_y, trainable_variables):\n",
+    "    # Wrap computation inside a GradientTape for automatic differentiation.\n",
+    "    with tf.GradientTape() as g:\n",
+    "        # Forward pass.\n",
+    "        pred = conv_net(batch_x, is_training=True)\n",
+    "        # Compute loss.\n",
+    "        loss = cross_entropy_loss(pred, batch_y)\n",
+    "    # Compute gradients.\n",
+    "    gradients = g.gradient(loss, trainable_variables)\n",
+    "    return gradients\n",
+    "\n",
+    "# Build the function to average the gradients.\n",
+    "@tf.function\n",
+    "def average_gradients(tower_grads):\n",
+    "    avg_grads = []\n",
+    "    # Iterate over the gradients of each variable, across all towers (GPUs).\n",
+    "    for tgrads in zip(*tower_grads):\n",
+    "        grads = []\n",
+    "        for g in tgrads:\n",
+    "            expanded_g = tf.expand_dims(g, 0)\n",
+    "            grads.append(expanded_g)\n",
+    "\n",
+    "        # Stack the per-tower gradients and average them.\n",
+    "        grad = tf.concat(axis=0, values=grads)\n",
+    "        grad = tf.reduce_mean(grad, 0)\n",
+    "\n",
+    "        avg_grads.append(grad)\n",
+    "\n",
+    "    return avg_grads"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": 7,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "with tf.device('/cpu:0'):\n",
+    "    # Build convnet.\n",
+    "    conv_net = ConvNet()\n",
+    "    # Adam optimizer.\n",
+    "    optimizer = tf.optimizers.Adam(learning_rate)"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": 8,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "# Optimization process.\n",
+    "def run_optimization(x, y):\n",
+    "    # Save gradients for all GPUs.\n",
+    "    tower_grads = []\n",
+    "    # Variables to update, i.e. trainable variables.\n",
+    "    trainable_variables = conv_net.trainable_variables\n",
+    "\n",
+    "    with tf.device('/cpu:0'):\n",
+    "        for i in range(num_gpus):\n",
+    "            # Split data between GPUs.\n",
+    "            gpu_batch_size = int(batch_size/num_gpus)\n",
+    "            batch_x = x[i * gpu_batch_size: (i+1) * gpu_batch_size]\n",
+    "            batch_y = y[i * gpu_batch_size: (i+1) * gpu_batch_size]\n",
+    "\n",
+    "            # Build the neural net on each GPU.\n",
+    "            with tf.device('/gpu:%i' % i):\n",
+    "                grad = backprop(batch_x, batch_y, trainable_variables)\n",
+    "                tower_grads.append(grad)\n",
+    "\n",
+    "                # On the last GPU, average the gradients from all GPUs.\n",
+    "                if i == num_gpus - 1:\n",
+    "                    gradients = average_gradients(tower_grads)\n",
+    "\n",
+    "    # Update vars following gradients.\n",
+    "    optimizer.apply_gradients(zip(gradients, trainable_variables))"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": 9,
+   "metadata": {
+    "scrolled": false
+   },
+   "outputs": [
+    {
+     "name": "stdout",
+     "output_type": "stream",
+     "text": [
+      "step: 1, loss: 2.302630, accuracy: 0.101318, speed: 16342.138481 examples/sec\n",
+      "step: 20, loss: 2.296755, accuracy: 0.108398, speed: 5355.197204 examples/sec\n",
+      "step: 40, loss: 2.216037, accuracy: 0.299072, speed: 12388.080848 examples/sec\n",
+      "step: 60, loss: 2.189814, accuracy: 0.362305, speed: 12033.404638 examples/sec\n",
+      "step: 80, loss: 2.137831, accuracy: 0.410156, speed: 12189.852065 examples/sec\n",
+      "step: 100, loss: 2.102876, accuracy: 0.437744, speed: 12212.349483 examples/sec\n",
+      "step: 120, loss: 2.077521, accuracy: 0.460693, speed: 12160.290400 examples/sec\n",
+      "step: 140, loss: 2.006775, accuracy: 0.545166, speed: 12202.175380 examples/sec\n",
+      "step: 160, loss: 1.994143, accuracy: 0.554443, speed: 12168.070368 examples/sec\n",
+      "step: 180, loss: 1.964281, accuracy: 0.597412, speed: 12244.148312 examples/sec\n",
+      "step: 200, loss: 1.893395, accuracy: 0.658203, speed: 12197.382402 examples/sec\n",
+      "step: 220, loss: 1.880256, accuracy: 0.672363, speed: 12178.323620 examples/sec\n",
+      "step: 240, loss: 1.868853, accuracy: 0.676025, speed: 12224.851444 examples/sec\n",
+      "step: 260, loss: 1.837151, accuracy: 0.705322, speed: 12101.154436 examples/sec\n",
+      "step: 280, loss: 1.799418, accuracy: 0.736816, speed: 12185.701420 examples/sec\n",
+      "step: 300, loss: 1.790719, accuracy: 0.755615, speed: 12126.826668 examples/sec\n",
+      "step: 320, loss: 1.732242, accuracy: 0.807861, speed: 12229.926783 examples/sec\n",
+      "step: 340, loss: 1.732089, accuracy: 0.806885, speed: 12167.651100 examples/sec\n",
+      "step: 360, loss: 1.693968, accuracy: 0.835693, speed: 12060.687471 examples/sec\n",
+      "step: 380, loss: 1.665804, accuracy: 0.862305, speed: 12130.389108 examples/sec\n",
+      "step: 400, loss: 1.627162, accuracy: 0.890381, speed: 12152.946766 examples/sec\n",
+      "step: 420, loss: 1.594189, accuracy: 0.920654, speed: 12057.401941 examples/sec\n",
+      "step: 440, loss: 1.575212, accuracy: 0.929688, speed: 12196.589206 examples/sec\n",
+      "step: 460, loss: 1.569351, accuracy: 0.942383, speed: 12147.345871 examples/sec\n",
+      "step: 480, loss: 1.520648, accuracy: 0.974609, speed: 11998.473978 examples/sec\n",
+      "step: 500, loss: 1.507439, accuracy: 0.982666, speed: 12152.490287 examples/sec\n",
+      "step: 520, loss: 1.495090, accuracy: 0.989746, speed: 12071.718912 examples/sec\n",
+      "step: 540, loss: 1.490940, accuracy: 0.989502, speed: 12049.224039 examples/sec\n",
+      "step: 560, loss: 1.476727, accuracy: 0.996338, speed: 12134.827424 examples/sec\n",
+      "step: 580, loss: 1.475038, accuracy: 0.995850, speed: 12128.228532 examples/sec\n",
+      "step: 600, loss: 1.469776, accuracy: 0.997559, speed: 12113.386949 examples/sec\n",
+      "step: 620, loss: 1.466832, accuracy: 0.999756, speed: 11939.016031 examples/sec\n",
+      "step: 640, loss: 1.466991, accuracy: 0.999023, speed: 12095.815773 examples/sec\n",
+      "step: 660, loss: 1.466177, accuracy: 0.999023, speed: 12035.037908 examples/sec\n",
+      "step: 680, loss: 1.465074, accuracy: 0.999512, speed: 11789.118097 examples/sec\n",
+      "step: 700, loss: 1.464655, accuracy: 0.999512, speed: 11965.087437 examples/sec\n",
+      "step: 720, loss: 1.465109, accuracy: 0.999512, speed: 11855.853520 examples/sec\n",
+      "step: 740, loss: 1.465021, accuracy: 0.999023, speed: 11774.901096 examples/sec\n",
+      "step: 760, loss: 1.463057, accuracy: 1.000000, speed: 11930.138289 examples/sec\n",
+      "step: 780, loss: 1.462609, accuracy: 1.000000, speed: 11766.752011 examples/sec\n",
+      "step: 800, loss: 1.462320, accuracy: 0.999756, speed: 11744.213314 examples/sec\n",
+      "step: 820, loss: 1.462975, accuracy: 1.000000, speed: 11700.815885 examples/sec\n",
+      "step: 840, loss: 1.462328, accuracy: 1.000000, speed: 11759.141371 examples/sec\n",
+      "step: 860, loss: 1.462561, accuracy: 1.000000, speed: 11650.397252 examples/sec\n",
+      "step: 880, loss: 1.462608, accuracy: 0.999512, speed: 11581.170575 examples/sec\n",
+      "step: 900, loss: 1.462178, accuracy: 0.999756, speed: 11562.545711 examples/sec\n",
+      "step: 920, loss: 1.461582, accuracy: 1.000000, speed: 11616.172231 examples/sec\n",
+      "step: 940, loss: 1.462402, accuracy: 1.000000, speed: 11709.561795 examples/sec\n",
+      "step: 960, loss: 1.462436, accuracy: 1.000000, speed: 11629.547741 examples/sec\n",
+      "step: 980, loss: 1.462415, accuracy: 1.000000, speed: 11623.658645 examples/sec\n",
+      "step: 1000, loss: 1.461925, accuracy: 1.000000, speed: 11579.716701 examples/sec\n"
+     ]
+    }
+   ],
+   "source": [
+    "# Run training for the given number of steps.\n",
+    "ts = time.time()\n",
+    "for step, (batch_x, batch_y) in enumerate(train_data.take(training_steps), 1):\n",
+    "    # Run the optimization to update the model variables.\n",
+    "    run_optimization(batch_x, batch_y)\n",
+    "\n",
+    "    if step % display_step == 0 or step == 1:\n",
+    "        dt = time.time() - ts\n",
+    "        # Only one step has run when the first report is printed.\n",
+    "        n_steps = display_step if step > 1 else 1\n",
+    "        speed = batch_size * n_steps / dt\n",
+    "        pred = conv_net(batch_x)\n",
+    "        loss = cross_entropy_loss(pred, batch_y)\n",
+    "        acc = accuracy(pred, batch_y)\n",
+    "        print(\"step: %i, loss: %f, accuracy: %f, speed: %f examples/sec\" % (step, loss, acc, speed))\n",
+    "        ts = time.time()"
+   ]
+  }
+ ],
+ "metadata": {
+  "kernelspec": {
+   "display_name": "Python 2",
+   "language": "python",
+   "name": "python2"
+  },
+  "language_info": {
+   "codemirror_mode": {
+    "name": "ipython",
+    "version": 2
+   },
+   "file_extension": ".py",
+   "mimetype": "text/x-python",
+   "name": "python",
+   "nbconvert_exporter": "python",
+   "pygments_lexer": "ipython2",
+   "version": "2.7.18"
+  }
+ },
+ "nbformat": 4,
+ "nbformat_minor": 2
+}